Oct 07 19:00:10 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 07 19:00:10 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 19:00:10 crc restorecon[4681]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 
19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 
crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 
19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 07 19:00:10 crc restorecon[4681]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 
crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:10 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 19:00:11 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 19:00:11 crc kubenswrapper[4825]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 19:00:11 crc kubenswrapper[4825]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 19:00:11 crc kubenswrapper[4825]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 19:00:11 crc kubenswrapper[4825]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 07 19:00:11 crc kubenswrapper[4825]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 19:00:11 crc kubenswrapper[4825]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.554052 4825 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559666 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559688 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559695 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559700 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559705 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559712 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559721 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559727 4825 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559732 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559737 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559743 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559748 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559754 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559760 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559766 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559772 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559777 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559780 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559785 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.559791 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560094 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560103 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560108 4825 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560112 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560116 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560120 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560124 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560127 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560132 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560135 4825 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560139 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560144 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560150 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560154 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560157 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560162 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560166 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560170 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560174 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560177 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560181 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560187 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560192 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560197 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560205 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560215 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560221 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560226 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560249 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560256 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560263 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560269 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560274 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560280 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560286 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560291 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560296 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560300 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560304 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560309 4825 feature_gate.go:330] unrecognized feature gate: Example
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560314 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560318 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560323 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560331 4825 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560335 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560340 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560345 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560350 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560357 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560362 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.560366 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560477 4825 flags.go:64] FLAG: --address="0.0.0.0"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560488 4825 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560501 4825 flags.go:64] FLAG: --anonymous-auth="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560508 4825 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560514 4825 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560518 4825 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560525 4825 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560531 4825 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560536 4825 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560540 4825 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560546 4825 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560552 4825 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560558 4825 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560565 4825 flags.go:64] FLAG: --cgroup-root=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560570 4825 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560575 4825 flags.go:64] FLAG: --client-ca-file=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560579 4825 flags.go:64] FLAG: --cloud-config=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560585 4825 flags.go:64] FLAG: --cloud-provider=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560590 4825 flags.go:64] FLAG: --cluster-dns="[]"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560598 4825 flags.go:64] FLAG: --cluster-domain=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560603 4825 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560607 4825 flags.go:64] FLAG: --config-dir=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560612 4825 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560616 4825 flags.go:64] FLAG: --container-log-max-files="5"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560622 4825 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560628 4825 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560632 4825 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560638 4825 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560642 4825 flags.go:64] FLAG: --contention-profiling="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560648 4825 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560653 4825 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560659 4825 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560665 4825 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560672 4825 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560678 4825 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560683 4825 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560688 4825 flags.go:64] FLAG: --enable-load-reader="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560692 4825 flags.go:64] FLAG: --enable-server="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560697 4825 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560702 4825 flags.go:64] FLAG: --event-burst="100"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560709 4825 flags.go:64] FLAG: --event-qps="50"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560714 4825 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560718 4825 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560723 4825 flags.go:64] FLAG: --eviction-hard=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560728 4825 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560733 4825 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560737 4825 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560742 4825 flags.go:64] FLAG: --eviction-soft=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560746 4825 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560750 4825 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560754 4825 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560758 4825 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560762 4825 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560767 4825 flags.go:64] FLAG: --fail-swap-on="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560771 4825 flags.go:64] FLAG: --feature-gates=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560777 4825 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560781 4825 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560786 4825 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560790 4825 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560795 4825 flags.go:64] FLAG: --healthz-port="10248"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560799 4825 flags.go:64] FLAG: --help="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560804 4825 flags.go:64] FLAG: --hostname-override=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560808 4825 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560813 4825 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560817 4825 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560821 4825 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560826 4825 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560830 4825 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560834 4825 flags.go:64] FLAG: --image-service-endpoint=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560838 4825 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560842 4825 flags.go:64] FLAG: --kube-api-burst="100"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560846 4825 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560852 4825 flags.go:64] FLAG: --kube-api-qps="50"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560856 4825 flags.go:64] FLAG: --kube-reserved=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560861 4825 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560867 4825 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560872 4825 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560877 4825 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560881 4825 flags.go:64] FLAG: --lock-file=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560885 4825 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560890 4825 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560895 4825 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560902 4825 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560908 4825 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560912 4825 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560917 4825 flags.go:64] FLAG: --logging-format="text"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560921 4825 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560926 4825 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560930 4825 flags.go:64] FLAG: --manifest-url=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560934 4825 flags.go:64] FLAG: --manifest-url-header=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560941 4825 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560945 4825 flags.go:64] FLAG: --max-open-files="1000000"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560951 4825 flags.go:64] FLAG: --max-pods="110"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560956 4825 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560960 4825 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560965 4825 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560969 4825 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560973 4825 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560977 4825 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560981 4825 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560994 4825 flags.go:64] FLAG: --node-status-max-images="50"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.560999 4825 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561003 4825 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561007 4825 flags.go:64] FLAG: --pod-cidr=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561011 4825 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561018 4825 flags.go:64] FLAG: --pod-manifest-path=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561022 4825 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561026 4825 flags.go:64] FLAG: --pods-per-core="0"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561030 4825 flags.go:64] FLAG: --port="10250"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561034 4825 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561038 4825 flags.go:64] FLAG: --provider-id=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561042 4825 flags.go:64] FLAG: --qos-reserved=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561046 4825 flags.go:64] FLAG: --read-only-port="10255"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561050 4825 flags.go:64] FLAG: --register-node="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561054 4825 flags.go:64] FLAG: --register-schedulable="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561058 4825 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561065 4825 flags.go:64] FLAG: --registry-burst="10"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561069 4825 flags.go:64] FLAG: --registry-qps="5"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561073 4825 flags.go:64] FLAG: --reserved-cpus=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561080 4825 flags.go:64] FLAG: --reserved-memory=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561086 4825 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561091 4825 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561095 4825 flags.go:64] FLAG: --rotate-certificates="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561099 4825 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561103 4825 flags.go:64] FLAG: --runonce="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561112 4825 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561116 4825 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561121 4825 flags.go:64] FLAG: --seccomp-default="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561125 4825 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561129 4825 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561133 4825 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561137 4825 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561141 4825 flags.go:64] FLAG: --storage-driver-password="root"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561145 4825 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561149 4825 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561153 4825 flags.go:64] FLAG: --storage-driver-user="root"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561157 4825 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561163 4825 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561167 4825 flags.go:64] FLAG: --system-cgroups=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561171 4825 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561178 4825 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561183 4825 flags.go:64] FLAG: --tls-cert-file=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561188 4825 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561196 4825 flags.go:64] FLAG: --tls-min-version=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561201 4825 flags.go:64] FLAG: --tls-private-key-file=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561206 4825 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561211 4825 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561218 4825 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561223 4825 flags.go:64] FLAG: --v="2"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561250 4825 flags.go:64] FLAG: --version="false"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561257 4825 flags.go:64] FLAG: --vmodule=""
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561270 4825 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.561275 4825 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562125 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562134 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562139 4825 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562143 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562149 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562153 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562157 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562161 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562164 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562168 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562172 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562175 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562179 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562183 4825 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562187 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562190 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562197 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562200 4825 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562204 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562207 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562211 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562215 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562219 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562222 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562229 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562255 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562260 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562264 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562268 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562274 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562278 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562282 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562286 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562290 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562294 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562298 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562301 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562305 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562311 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562315 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562319 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562323 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562326 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562329 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562333 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562336 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562340 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562344 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562349 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562353 4825 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562356 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562360 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562365 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562369 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562373 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562377 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562381 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562385 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562389 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562393 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562396 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562400 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562403 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562407 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562411 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562416 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562421 4825 feature_gate.go:330] unrecognized feature gate: Example
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562426 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562431 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562437 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.562441 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.562455 4825 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.573402 4825 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.573463 4825 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573605 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573620 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573631 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573642 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573652 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573661 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573670 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573681 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573695 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573705 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573715 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573725 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573734 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573744 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573754 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573763 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573772 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573781 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 07 19:00:11 crc
kubenswrapper[4825]: W1007 19:00:11.573790 4825 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573799 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573807 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573816 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573824 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573833 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573841 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573850 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573858 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573866 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573875 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573887 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573896 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573909 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573920 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573930 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573941 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573952 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573961 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573972 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573981 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573990 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.573999 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574007 4825 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574016 4825 feature_gate.go:330] unrecognized feature gate: Example Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574024 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574033 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574041 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 
19:00:11.574049 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574058 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574066 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574075 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574084 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574092 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574100 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574108 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574117 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574125 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574133 4825 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574142 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574150 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574158 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574167 4825 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574176 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574184 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574194 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574203 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574215 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574255 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574266 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574275 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574283 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574292 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.574308 4825 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 19:00:11 crc 
kubenswrapper[4825]: W1007 19:00:11.574668 4825 feature_gate.go:330] unrecognized feature gate: Example Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574686 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574696 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574708 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574719 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574729 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574740 4825 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574751 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574762 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574773 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574784 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574794 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574803 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574812 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574821 4825 feature_gate.go:330] unrecognized feature gate: 
OVNObservability Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574832 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574842 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574853 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574864 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574874 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574885 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574895 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574909 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574928 4825 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574941 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574953 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574966 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574978 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.574989 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575004 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575015 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575026 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575038 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575048 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575058 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575069 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575080 4825 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 
19:00:11.575090 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575104 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575118 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575131 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575142 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575153 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575163 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575175 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575187 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575199 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575210 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575220 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575267 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575280 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575292 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575302 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575312 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575324 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575335 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575347 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575357 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575367 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575377 
4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575388 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575398 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575407 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575416 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575424 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575439 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575451 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575489 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575499 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575508 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.575516 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.575530 4825 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.575827 4825 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.581902 4825 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.582043 4825 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.583706 4825 server.go:997] "Starting client certificate rotation" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.583747 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.583938 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-12 15:56:32.84145641 +0000 UTC Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.584069 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1580h56m21.257390105s for next certificate rotation Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.608894 4825 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.611717 4825 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.630858 4825 log.go:25] "Validated CRI v1 runtime API" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.669316 4825 log.go:25] 
"Validated CRI v1 image API" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.671795 4825 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.678666 4825 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-16-57-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.678722 4825 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.704369 4825 manager.go:217] Machine: {Timestamp:2025-10-07 19:00:11.701278424 +0000 UTC m=+0.523317141 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:da8b2757-4bf3-4b55-84bb-69d70219b543 BootID:951f58e0-4df3-42e3-a827-d82d183370bf Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 
DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:60:1d:79 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:60:1d:79 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:df:49:43 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:99:bc:3c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ef:a8:82 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:47:fc:1d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:b4:78:f0:ba:10 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4a:58:27:ec:17:51 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data 
Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 
Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.704805 4825 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.705047 4825 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.705853 4825 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.706379 4825 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.706458 4825 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.708068 4825 topology_manager.go:138] "Creating topology manager with none policy" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.708105 4825 container_manager_linux.go:303] "Creating device plugin manager" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.708602 4825 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.708668 4825 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.709104 4825 state_mem.go:36] "Initialized new in-memory state store" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.709487 4825 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.712726 4825 kubelet.go:418] "Attempting to sync node with API server" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.712773 4825 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.712802 4825 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.712823 4825 kubelet.go:324] "Adding apiserver pod source" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.712843 4825 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.717384 4825 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.718579 4825 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.720181 4825 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.720745 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.720751 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.721014 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.720925 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.721967 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722012 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722027 
4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722042 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722064 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722078 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722092 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722116 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722136 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722150 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722172 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722185 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.722969 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.723728 4825 server.go:1280] "Started kubelet" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.724992 4825 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.725121 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.725867 4825 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 07 19:00:11 crc systemd[1]: Started Kubernetes Kubelet. Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.725737 4825 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.727820 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.727908 4825 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.727926 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 06:11:09.82202475 +0000 UTC Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.727996 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1547h10m58.094031048s for next certificate rotation Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.728757 4825 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.728795 4825 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.728951 4825 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.728996 4825 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.730110 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.730256 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.732644 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="200ms" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.732589 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.45:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c4aa175ed18e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 19:00:11.723684064 +0000 UTC m=+0.545722741,LastTimestamp:2025-10-07 19:00:11.723684064 +0000 UTC m=+0.545722741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.734138 4825 factory.go:153] Registering CRI-O factory Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.734216 4825 factory.go:221] Registration of the crio container factory successfully Oct 
07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.734337 4825 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.734350 4825 factory.go:55] Registering systemd factory Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.734360 4825 factory.go:221] Registration of the systemd container factory successfully Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.734402 4825 factory.go:103] Registering Raw factory Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.738192 4825 manager.go:1196] Started watching for new ooms in manager Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.740321 4825 manager.go:319] Starting recovery of all containers Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750148 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750305 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750331 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750363 4825 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750384 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750403 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750426 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750446 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750471 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750493 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750512 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750531 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750555 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750590 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750611 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750633 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750660 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750680 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750705 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750730 4825 server.go:460] "Adding debug handlers to kubelet server" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.750738 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752392 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752465 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752489 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752509 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752528 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752560 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752590 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752614 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752636 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752655 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752716 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.752739 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.754914 4825 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.754987 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755035 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755082 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755122 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755154 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755181 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755218 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755289 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755322 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755352 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755385 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755419 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755453 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" 
seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755481 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755514 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755611 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755645 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755676 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755728 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755754 4825 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755790 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755832 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755862 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755893 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755920 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755951 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.755975 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756000 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756022 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756042 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756062 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756081 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756100 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756131 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756148 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756167 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756186 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756205 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756223 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756281 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756298 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756317 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756336 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756356 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" 
seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756373 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756393 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756412 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756430 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756448 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756505 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756525 
4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756545 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756565 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756587 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756607 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756628 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756649 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756673 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756692 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756711 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756732 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756750 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756771 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756790 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756810 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756829 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756849 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756869 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756893 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756913 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756933 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756952 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.756982 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757008 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757030 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" 
seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757051 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757071 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757094 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757117 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757140 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757161 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 
19:00:11.757181 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757201 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757221 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757269 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757343 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757365 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757386 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757408 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757430 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757449 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757468 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757490 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757512 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757535 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757554 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757575 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757606 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757631 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757654 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757682 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757708 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757729 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757748 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757766 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757787 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757805 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757825 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757846 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757865 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757886 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757905 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" 
seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757926 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757945 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757964 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.757981 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758002 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758022 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 
19:00:11.758043 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758063 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758086 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758108 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758126 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758147 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758166 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758185 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758205 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758224 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758275 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758297 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758316 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758335 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758355 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758374 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758394 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758413 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758432 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758452 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758471 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758490 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758511 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758538 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758556 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" 
seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758577 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758597 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758616 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758637 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758658 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758678 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 
19:00:11.758697 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758728 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758746 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758765 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758786 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758804 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758824 4825 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758843 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758866 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758884 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758904 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758921 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758941 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758961 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758981 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.758999 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759019 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759036 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759055 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759074 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759093 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759111 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759132 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759151 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759169 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" 
Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759185 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759204 4825 reconstruct.go:97] "Volume reconstruction finished" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.759217 4825 reconciler.go:26] "Reconciler: start to sync state" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.761646 4825 manager.go:324] Recovery completed Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.777013 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.782774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.782856 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.782875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.785362 4825 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.785391 4825 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.785421 4825 state_mem.go:36] "Initialized new in-memory state store" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.790733 4825 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.793013 4825 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.794001 4825 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.794081 4825 kubelet.go:2335] "Starting kubelet main sync loop" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.794188 4825 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 07 19:00:11 crc kubenswrapper[4825]: W1007 19:00:11.800145 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.800246 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.803584 4825 policy_none.go:49] "None policy: Start" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.807002 4825 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.807056 4825 state_mem.go:35] "Initializing new in-memory state store" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.829250 4825 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.871932 4825 manager.go:334] "Starting Device Plugin manager" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.871995 4825 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.872012 4825 server.go:79] "Starting device plugin registration server" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.872558 4825 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.872597 4825 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.872743 4825 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.873017 4825 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.873050 4825 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.887923 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.895155 4825 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.895302 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.896897 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.896973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.896999 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.897351 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.897732 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.897810 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899111 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899168 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899192 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899285 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899380 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899706 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.899995 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.900110 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.901519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.901536 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.901591 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.901624 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.901632 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.901653 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.901840 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.902104 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.902191 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903261 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903291 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903305 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903452 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903485 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903500 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903598 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.903639 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.904643 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.904703 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.904717 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.904758 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.904802 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.904817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.904953 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.905025 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.906057 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.906093 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.906107 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.935369 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="400ms" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.961577 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.961702 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.961765 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.961803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.961841 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.961875 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.961938 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962013 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962101 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962253 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962324 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962357 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962388 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962421 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.962457 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.972989 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.974752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.974814 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.974830 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:11 crc kubenswrapper[4825]: I1007 19:00:11.974874 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 19:00:11 crc kubenswrapper[4825]: E1007 19:00:11.975552 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc" Oct 07 19:00:12 crc 
kubenswrapper[4825]: I1007 19:00:12.063769 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.063864 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.063910 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.063942 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.063976 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064008 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064040 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064037 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064120 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064176 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064075 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064278 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064346 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064376 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064381 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064397 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064512 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064597 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064606 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064626 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064673 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064676 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064746 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064756 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064728 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064829 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064854 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.064902 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.065021 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.176739 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.178644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.178710 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.178731 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.178816 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: E1007 19:00:12.179490 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.235604 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.259954 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: W1007 19:00:12.284308 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-077a1b7327f6ced2984da45334ddb56baaa4a30436b28e830457722b23ddacc4 WatchSource:0}: Error finding container 077a1b7327f6ced2984da45334ddb56baaa4a30436b28e830457722b23ddacc4: Status 404 returned error can't find the container with id 077a1b7327f6ced2984da45334ddb56baaa4a30436b28e830457722b23ddacc4
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.294990 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.309153 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.318283 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: W1007 19:00:12.327785 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-86a3ae868d4ccb34165c7aa3012ad22d214a89b988daa5e46b270dd049acb36c WatchSource:0}: Error finding container 86a3ae868d4ccb34165c7aa3012ad22d214a89b988daa5e46b270dd049acb36c: Status 404 returned error can't find the container with id 86a3ae868d4ccb34165c7aa3012ad22d214a89b988daa5e46b270dd049acb36c
Oct 07 19:00:12 crc kubenswrapper[4825]: E1007 19:00:12.336759 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="800ms"
Oct 07 19:00:12 crc kubenswrapper[4825]: W1007 19:00:12.341113 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-affbacd80f365da5ee36062c45150e922959e292ca029f0b055a165764adb271 WatchSource:0}: Error finding container affbacd80f365da5ee36062c45150e922959e292ca029f0b055a165764adb271: Status 404 returned error can't find the container with id affbacd80f365da5ee36062c45150e922959e292ca029f0b055a165764adb271
Oct 07 19:00:12 crc kubenswrapper[4825]: W1007 19:00:12.352511 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f506a407a1017b73eed5d93f82ae6ff7c8ff32eb9cbf85313f60c4bca8506639 WatchSource:0}: Error finding container f506a407a1017b73eed5d93f82ae6ff7c8ff32eb9cbf85313f60c4bca8506639: Status 404 returned error can't find the container with id f506a407a1017b73eed5d93f82ae6ff7c8ff32eb9cbf85313f60c4bca8506639
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.580369 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.581772 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.581809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.581821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.581845 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: E1007 19:00:12.582445 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.726353 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:12 crc kubenswrapper[4825]: W1007 19:00:12.726368 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:12 crc kubenswrapper[4825]: E1007 19:00:12.726451 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.801047 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"affbacd80f365da5ee36062c45150e922959e292ca029f0b055a165764adb271"}
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.802179 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"86a3ae868d4ccb34165c7aa3012ad22d214a89b988daa5e46b270dd049acb36c"}
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.804075 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa294afb790163500a964d1b8027d29ea7cbb54577bd5904416c0f1d50401224"}
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.806274 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"077a1b7327f6ced2984da45334ddb56baaa4a30436b28e830457722b23ddacc4"}
Oct 07 19:00:12 crc kubenswrapper[4825]: I1007 19:00:12.807270 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f506a407a1017b73eed5d93f82ae6ff7c8ff32eb9cbf85313f60c4bca8506639"}
Oct 07 19:00:12 crc kubenswrapper[4825]: W1007 19:00:12.944647 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:12 crc kubenswrapper[4825]: E1007 19:00:12.944736 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Oct 07 19:00:12 crc kubenswrapper[4825]: W1007 19:00:12.944694 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:12 crc kubenswrapper[4825]: E1007 19:00:12.944827 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Oct 07 19:00:13 crc kubenswrapper[4825]: E1007 19:00:13.138121 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="1.6s"
Oct 07 19:00:13 crc kubenswrapper[4825]: W1007 19:00:13.160133 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:13 crc kubenswrapper[4825]: E1007 19:00:13.160314 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.383060 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.385513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.385581 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.385596 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.385635 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 07 19:00:13 crc kubenswrapper[4825]: E1007 19:00:13.386220 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.726791 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.812560 4825 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490" exitCode=0
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.812655 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490"}
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.812708 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.813946 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.814001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.814019 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.816542 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef"}
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.816602 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88"}
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.819146 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26" exitCode=0
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.819273 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.819281 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26"}
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.820336 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.820375 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.820393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.821612 4825 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6" exitCode=0
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.821730 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6"}
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.821757 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.822130 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.823280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.823332 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.823374 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.826506 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.826563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.826580 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.827467 4825 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="679ef9cbf4a8629dbdde9fdc019f83f9a2c7547f01e94274b597491322b6fd50" exitCode=0
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.827535 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"679ef9cbf4a8629dbdde9fdc019f83f9a2c7547f01e94274b597491322b6fd50"}
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.827603 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.828845 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.828901 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:13 crc kubenswrapper[4825]: I1007 19:00:13.828917 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.725954 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:14 crc kubenswrapper[4825]: E1007 19:00:14.739645 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.45:6443: connect: connection refused" interval="3.2s"
Oct 07 19:00:14 crc kubenswrapper[4825]: W1007 19:00:14.807477 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused
Oct 07 19:00:14 crc kubenswrapper[4825]: E1007 19:00:14.807602 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.833845 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.833900 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.833913 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.834048 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.834929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.834957 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.834967 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.837990 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.838018 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.838083 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.838838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.838862 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.838875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.842851 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.842894 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.842909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.842923 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.845354 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.845396 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.845213 4825 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8" exitCode=0
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.846115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.846140 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.846150 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.847825 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fa7c70a9c7a88be1caa194e5ddab1f65f60518ca17d860a88ed660a9d033f758"}
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.847915 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.848754 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.848777 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.848787 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.987321 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.991062 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.991124 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.991136 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:14 crc kubenswrapper[4825]: I1007 19:00:14.991168 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 19:00:14 crc kubenswrapper[4825]: E1007 19:00:14.991732 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.45:6443: connect: connection refused" node="crc" Oct 07 19:00:15 crc kubenswrapper[4825]: W1007 19:00:15.184299 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.45:6443: connect: connection refused Oct 07 19:00:15 crc kubenswrapper[4825]: E1007 19:00:15.184400 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.45:6443: connect: connection refused" logger="UnhandledError" Oct 07 19:00:15 crc 
kubenswrapper[4825]: I1007 19:00:15.860152 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1"} Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.860301 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.861765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.861808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.861820 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.864151 4825 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970" exitCode=0 Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.864182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970"} Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.864289 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.864327 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.864369 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.864327 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.864432 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865169 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865240 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865253 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865793 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865813 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.865841 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 
19:00:15.866120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.866263 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:15 crc kubenswrapper[4825]: I1007 19:00:15.866288 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.002168 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.307617 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.601942 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.844954 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.873493 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c"} Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.873560 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15"} Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.873582 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.873588 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.873741 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.873591 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa"} Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.873854 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96"} Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875655 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875686 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875721 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875737 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875722 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875778 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:16 crc 
kubenswrapper[4825]: I1007 19:00:16.875802 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875839 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:16 crc kubenswrapper[4825]: I1007 19:00:16.875850 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.883929 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be"} Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.884075 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.884133 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.884143 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885764 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885825 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:17 crc 
kubenswrapper[4825]: I1007 19:00:17.885846 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885862 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:17 crc kubenswrapper[4825]: I1007 19:00:17.885962 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.192077 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.193965 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.194026 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.194045 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.194098 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.437843 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.887508 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.887565 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.889218 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.889270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.889282 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.889338 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.889394 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:18 crc kubenswrapper[4825]: I1007 19:00:18.889408 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:19 crc kubenswrapper[4825]: I1007 19:00:19.845576 4825 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 19:00:19 crc kubenswrapper[4825]: I1007 19:00:19.845735 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.472947 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.473218 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.475702 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.475782 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.475814 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.486599 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.894867 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.895032 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.896068 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.896107 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:20 crc kubenswrapper[4825]: I1007 19:00:20.896117 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.239453 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.239702 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.247092 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.247157 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.247178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:21 crc kubenswrapper[4825]: E1007 19:00:21.888165 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.896955 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.898407 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.898458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:21 crc kubenswrapper[4825]: I1007 19:00:21.898480 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:23 crc kubenswrapper[4825]: I1007 19:00:23.236167 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:23 crc kubenswrapper[4825]: I1007 19:00:23.237211 4825 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:23 crc kubenswrapper[4825]: I1007 19:00:23.238754 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:23 crc kubenswrapper[4825]: I1007 19:00:23.238808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:23 crc kubenswrapper[4825]: I1007 19:00:23.238827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:25 crc kubenswrapper[4825]: W1007 19:00:25.495720 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.495869 4825 trace.go:236] Trace[2139446933]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 19:00:15.494) (total time: 10001ms): Oct 07 19:00:25 crc kubenswrapper[4825]: Trace[2139446933]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:00:25.495) Oct 07 19:00:25 crc kubenswrapper[4825]: Trace[2139446933]: [10.001671495s] [10.001671495s] END Oct 07 19:00:25 crc kubenswrapper[4825]: E1007 19:00:25.495903 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 19:00:25 crc kubenswrapper[4825]: W1007 19:00:25.685364 4825 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.685492 4825 trace.go:236] Trace[517843945]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 19:00:15.683) (total time: 10001ms): Oct 07 19:00:25 crc kubenswrapper[4825]: Trace[517843945]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:00:25.685) Oct 07 19:00:25 crc kubenswrapper[4825]: Trace[517843945]: [10.001911253s] [10.001911253s] END Oct 07 19:00:25 crc kubenswrapper[4825]: E1007 19:00:25.685521 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.727148 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.760894 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.761193 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.763127 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.763167 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:25 crc kubenswrapper[4825]: I1007 19:00:25.763183 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:26 crc kubenswrapper[4825]: I1007 19:00:26.309512 4825 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 19:00:26 crc kubenswrapper[4825]: I1007 19:00:26.309591 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 19:00:26 crc kubenswrapper[4825]: I1007 19:00:26.319271 4825 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 19:00:26 crc kubenswrapper[4825]: I1007 19:00:26.319341 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 19:00:26 crc kubenswrapper[4825]: I1007 19:00:26.611704 4825 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]log ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]etcd ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/generic-apiserver-start-informers ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/priority-and-fairness-filter ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-apiextensions-informers ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-apiextensions-controllers ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/crd-informer-synced ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-system-namespaces-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 07 19:00:26 crc kubenswrapper[4825]: 
[+]poststarthook/start-legacy-token-tracking-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 07 19:00:26 crc kubenswrapper[4825]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 07 19:00:26 crc kubenswrapper[4825]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/bootstrap-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/start-kube-aggregator-informers ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/apiservice-registration-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/apiservice-discovery-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]autoregister-completion ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/apiservice-openapi-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 07 19:00:26 crc kubenswrapper[4825]: livez check failed Oct 07 19:00:26 crc kubenswrapper[4825]: I1007 19:00:26.612393 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:00:29 crc 
kubenswrapper[4825]: I1007 19:00:29.845836 4825 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 19:00:29 crc kubenswrapper[4825]: I1007 19:00:29.845939 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 19:00:29 crc kubenswrapper[4825]: I1007 19:00:29.848834 4825 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.317129 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.320529 4825 trace.go:236] Trace[2073925003]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 19:00:19.501) (total time: 11818ms): Oct 07 19:00:31 crc kubenswrapper[4825]: Trace[2073925003]: ---"Objects listed" error: 11818ms (19:00:31.320) Oct 07 19:00:31 crc kubenswrapper[4825]: Trace[2073925003]: [11.818623218s] [11.818623218s] END Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.320590 4825 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.321567 4825 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.321863 4825 trace.go:236] Trace[1216707248]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 19:00:19.204) (total time: 12117ms): Oct 07 19:00:31 crc kubenswrapper[4825]: Trace[1216707248]: ---"Objects listed" error: 12117ms (19:00:31.321) Oct 07 19:00:31 crc kubenswrapper[4825]: Trace[1216707248]: [12.117188367s] [12.117188367s] END Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.321902 4825 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.323335 4825 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.383809 4825 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45010->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.383895 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45010->192.168.126.11:17697: read: connection reset by peer" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.608559 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.609161 4825 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.609286 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.612921 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.725179 4825 apiserver.go:52] "Watching apiserver" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.729901 4825 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.730173 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-xvdcs"] Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.730612 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.730711 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.730790 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.730807 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.730839 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.730993 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.731045 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.731039 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.730978 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.731561 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvdcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.732884 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.733468 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.733809 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.734974 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.735271 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.735503 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.736357 4825 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.736523 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.736604 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.738570 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.738656 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.738677 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.755992 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.771260 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.786979 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.796838 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.799501 4825 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.810571 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.819449 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.827976 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.830946 4825 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.837058 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6bwfw"] Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.837817 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b6jcs"] Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.838003 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zk9x9"] Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.838002 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.838202 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.838422 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.840253 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.840253 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.840262 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.840555 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.840927 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.841165 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.841908 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.842061 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.842083 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.842191 4825 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.842506 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.843985 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.846458 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.859770 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.872271 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.883703 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.896006 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.910586 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.918910 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926421 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926475 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926501 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926530 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926549 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926627 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926646 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926664 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926683 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926702 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926719 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926736 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926751 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926767 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926786 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926802 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926818 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926842 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926862 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926882 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926920 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926937 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926956 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926976 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.926997 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927016 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927036 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927035 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927053 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927077 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927123 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927155 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927182 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927274 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927332 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927355 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927376 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927397 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927427 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927447 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927469 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927481 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927509 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927544 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927571 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927597 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927624 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927647 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927674 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927699 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927722 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927748 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") 
pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927772 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927758 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927797 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927889 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927893 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: 
"5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927919 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927944 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927962 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.927981 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928002 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928020 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928039 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928057 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928076 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928093 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928111 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928130 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928153 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928174 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928216 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928254 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928271 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928288 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928304 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928319 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928345 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928362 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928399 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928402 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928428 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928448 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928468 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928488 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928500 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928512 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928548 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928568 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928591 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928580 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928601 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928614 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928873 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928902 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928933 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928963 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 
19:00:31.928986 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.928998 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929009 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929108 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929158 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929192 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929215 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929253 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929266 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929276 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929334 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929363 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929369 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929471 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929477 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929503 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929588 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929620 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929650 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929679 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929700 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929707 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929737 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929764 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929790 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929815 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929813 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929847 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929917 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929935 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929914 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.929952 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930004 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930056 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930107 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 19:00:31 
crc kubenswrapper[4825]: I1007 19:00:31.930142 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930175 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930191 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930203 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930249 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930275 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930281 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930337 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930346 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930392 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930470 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930497 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930604 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930652 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930672 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930700 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930707 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930758 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930770 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930821 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930883 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930916 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930949 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.930978 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931133 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931164 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931192 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931244 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931260 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931277 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931309 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931340 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931369 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931397 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931423 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931454 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931485 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931512 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931536 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931539 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931561 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931684 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931789 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931828 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931885 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931915 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931924 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931966 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931952 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.931979 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932038 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932068 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932348 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932388 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932423 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932455 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932487 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932518 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932601 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932642 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932683 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932716 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932747 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932776 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932809 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932837 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932869 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933906 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933996 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.934083 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935422 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935490 
4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935523 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935551 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935580 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935620 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935675 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935731 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935758 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935820 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935848 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935903 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935961 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935989 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936017 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936043 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936068 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935934 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936101 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936130 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936184 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936212 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936267 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936294 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936319 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936395 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936450 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936479 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936507 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936581 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-system-cni-dir\") pod \"multus-zk9x9\" (UID: 
\"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936641 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-cnibin\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936665 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936734 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-cni-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936756 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-multus-certs\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936813 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936840 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cni-binary-copy\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937057 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937098 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-kubelet\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937130 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937158 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937206 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937354 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937382 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f038b04-14c9-421c-91e9-ab654b6c4ac8-hosts-file\") pod \"node-resolver-xvdcs\" (UID: \"3f038b04-14c9-421c-91e9-ab654b6c4ac8\") " pod="openshift-dns/node-resolver-xvdcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937437 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937467 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937520 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4krj4\" (UniqueName: \"kubernetes.io/projected/3f038b04-14c9-421c-91e9-ab654b6c4ac8-kube-api-access-4krj4\") pod \"node-resolver-xvdcs\" (UID: \"3f038b04-14c9-421c-91e9-ab654b6c4ac8\") " pod="openshift-dns/node-resolver-xvdcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937566 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937622 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937650 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 19:00:31 crc 
kubenswrapper[4825]: I1007 19:00:31.937701 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-mcd-auth-proxy-config\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937725 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-daemon-config\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937780 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937808 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-os-release\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937831 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-proxy-tls\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " 
pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937879 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-k8s-cni-cncf-io\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937911 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-cni-bin\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938010 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cnibin\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938060 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wjlg\" (UniqueName: \"kubernetes.io/projected/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-kube-api-access-2wjlg\") pod 
\"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938087 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-socket-dir-parent\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938136 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-rootfs\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938161 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-hostroot\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938182 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-etc-kubernetes\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938253 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfhp\" (UniqueName: 
\"kubernetes.io/projected/e48a4135-d1b9-4dfb-89fc-be393f7937aa-kube-api-access-xwfhp\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938283 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-os-release\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938304 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44f62e96-26a6-4bfe-8e8c-6884216bd363-cni-binary-copy\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938352 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-netns\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938377 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2gzp\" (UniqueName: \"kubernetes.io/projected/44f62e96-26a6-4bfe-8e8c-6884216bd363-kube-api-access-k2gzp\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938432 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938462 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938508 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-system-cni-dir\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938538 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-cni-multus\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938562 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-conf-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938633 4825 
reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938652 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938695 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938711 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938726 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938741 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938780 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939349 4825 
reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939374 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939389 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939405 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939423 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939441 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939457 4825 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939472 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939487 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939501 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939517 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939531 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939549 4825 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939565 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939581 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" 
Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939597 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939611 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939625 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939639 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939654 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939668 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939683 4825 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939698 4825 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939712 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939728 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939742 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939756 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939755 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1" exitCode=255 Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939775 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939824 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940508 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1"} Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932152 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.941003 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932508 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932533 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932552 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932591 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932845 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932852 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932854 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.932948 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933143 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933179 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933214 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933330 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933642 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933716 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.933912 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.934133 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.934449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.934469 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.934468 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.934918 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.934942 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935208 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935564 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935580 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935608 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.935854 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936118 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936411 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936558 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936807 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.936894 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937291 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937371 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937556 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937634 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937796 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.937957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938048 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938428 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938426 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938463 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938521 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938532 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.938819 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939210 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939324 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939358 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939473 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939530 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939751 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939827 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939837 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.939925 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.940015 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:00:32.439994909 +0000 UTC m=+21.262033556 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940113 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940362 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940386 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940427 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940711 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940839 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.940985 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.941164 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942003 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.942108 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942110 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942161 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942659 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942718 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942744 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942766 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.942943 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.943380 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:32.443364225 +0000 UTC m=+21.265402862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.943656 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.943898 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.943961 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944162 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944527 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944572 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944697 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944827 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944986 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944915 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.945198 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.945508 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.945670 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.945775 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.945793 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.945869 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.945962 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.946206 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.946211 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.946291 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.946940 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.947081 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.947520 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.947869 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.947872 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.947901 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.947982 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.948168 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.948183 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.944724 4825 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.948433 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.948449 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.948591 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.948620 4825 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.948692 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:32.448662544 +0000 UTC m=+21.270701221 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.949143 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.949185 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.949290 4825 scope.go:117] "RemoveContainer" containerID="f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.949323 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.949449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.950452 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.950660 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.952648 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.952806 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.953357 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.953373 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.953721 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.956472 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.957705 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.961796 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.962183 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.962410 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.963146 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.963671 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.964055 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.964245 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.964438 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.964619 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.964958 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.965365 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.965706 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.965907 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.965945 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.966295 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.968827 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.969166 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.969449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.969525 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.969858 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.969945 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.970200 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.970385 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.972797 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.972891 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.972923 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.972916 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.972947 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.973409 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.973835 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.974032 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:32.473001709 +0000 UTC m=+21.295040356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.974287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.975161 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.975756 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.975818 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.975842 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:31 crc kubenswrapper[4825]: E1007 19:00:31.975943 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:32.475916442 +0000 UTC m=+21.297955279 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.976745 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.978359 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.979498 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.979694 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.981619 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.982018 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.983308 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.983447 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.984110 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.984676 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.985502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.985762 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.987548 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.989783 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.990589 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.991764 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.999039 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 19:00:31 crc kubenswrapper[4825]: I1007 19:00:31.999755 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.002113 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.022862 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.034524 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.043830 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-system-cni-dir\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.043876 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-cni-multus\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.043899 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-conf-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.043930 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-cnibin\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.043952 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-system-cni-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.043993 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-multus-certs\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044016 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044032 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-cni-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044060 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cni-binary-copy\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044059 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-cni-multus\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044115 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-kubelet\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044079 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-kubelet\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044168 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-conf-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044214 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.044414 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.045662 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.050934 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.053240 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.054387 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-cnibin\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.054462 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-system-cni-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.054487 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-multus-certs\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.054515 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.054572 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-cni-dir\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064510 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-system-cni-dir\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064632 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064658 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f038b04-14c9-421c-91e9-ab654b6c4ac8-hosts-file\") pod \"node-resolver-xvdcs\" (UID: \"3f038b04-14c9-421c-91e9-ab654b6c4ac8\") " pod="openshift-dns/node-resolver-xvdcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064680 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4krj4\" (UniqueName: \"kubernetes.io/projected/3f038b04-14c9-421c-91e9-ab654b6c4ac8-kube-api-access-4krj4\") pod \"node-resolver-xvdcs\" (UID: \"3f038b04-14c9-421c-91e9-ab654b6c4ac8\") " pod="openshift-dns/node-resolver-xvdcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064697 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-mcd-auth-proxy-config\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064745 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-daemon-config\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064760 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-os-release\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064775 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-proxy-tls\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064791 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-k8s-cni-cncf-io\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064807 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-cni-bin\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064826 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cnibin\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064841 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wjlg\" (UniqueName: \"kubernetes.io/projected/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-kube-api-access-2wjlg\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064858 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-socket-dir-parent\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064883 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-etc-kubernetes\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064900 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-rootfs\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064917 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-hostroot\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064935 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfhp\" (UniqueName: \"kubernetes.io/projected/e48a4135-d1b9-4dfb-89fc-be393f7937aa-kube-api-access-xwfhp\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064969 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-os-release\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.064987 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44f62e96-26a6-4bfe-8e8c-6884216bd363-cni-binary-copy\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065005 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-netns\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065025 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2gzp\" (UniqueName: \"kubernetes.io/projected/44f62e96-26a6-4bfe-8e8c-6884216bd363-kube-api-access-k2gzp\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065161 4825 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065174 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065185 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065195 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065204 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065215 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065241 4825 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065251 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065261 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065271 4825 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065282 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065291 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065301 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065311 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065320 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065329 4825 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065339 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065348 4825 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065358 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065367 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065377 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065387 4825 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065396 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065405 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065416 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065469 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-var-lib-cni-bin\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.065536 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f038b04-14c9-421c-91e9-ab654b6c4ac8-hosts-file\") pod \"node-resolver-xvdcs\" (UID: \"3f038b04-14c9-421c-91e9-ab654b6c4ac8\") " pod="openshift-dns/node-resolver-xvdcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066436 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-socket-dir-parent\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066561 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-etc-kubernetes\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066578 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-rootfs\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066591 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-hostroot\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066610 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cni-binary-copy\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066705 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-os-release\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066702 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-k8s-cni-cncf-io\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066741 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.066815 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-os-release\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067057 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-mcd-auth-proxy-config\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067099 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cnibin\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067141 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44f62e96-26a6-4bfe-8e8c-6884216bd363-host-run-netns\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067174 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44f62e96-26a6-4bfe-8e8c-6884216bd363-cni-binary-copy\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9"
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067374 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067395 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067409 4825 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067419 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067429 4825 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067494 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067518 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067535 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067549 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067564 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node
\"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067576 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067589 4825 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067591 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e48a4135-d1b9-4dfb-89fc-be393f7937aa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067602 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067629 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067643 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067655 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067665 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067675 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067685 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067695 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067704 4825 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067713 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067724 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" 
DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067734 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067743 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067752 4825 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067761 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067770 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067780 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067789 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067798 4825 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067807 4825 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067817 4825 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067827 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067838 4825 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067847 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067857 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067866 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067876 4825 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067885 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067884 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/44f62e96-26a6-4bfe-8e8c-6884216bd363-multus-daemon-config\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067894 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067938 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067948 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067957 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067967 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067976 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067986 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.067998 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068010 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068022 4825 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068035 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" 
DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068044 4825 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068053 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068061 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068071 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068084 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068096 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068130 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath 
\"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068140 4825 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068149 4825 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068159 4825 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068170 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068181 4825 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068192 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068205 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068218 4825 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068247 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068263 4825 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068275 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068288 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068300 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068312 4825 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068323 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068334 4825 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068346 4825 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068357 4825 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068368 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068379 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068391 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068402 4825 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 
07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068414 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068425 4825 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068439 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068451 4825 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068463 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068475 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068486 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 
19:00:32.068498 4825 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068512 4825 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068523 4825 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068534 4825 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068545 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068572 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068582 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068593 4825 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068604 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068620 4825 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068632 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068643 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068657 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068666 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068677 4825 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068687 4825 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068697 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068707 4825 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068716 4825 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068726 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068738 4825 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068749 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 
19:00:32.068760 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068773 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068784 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068795 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068807 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068819 4825 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068834 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068847 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068859 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068871 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068882 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068893 4825 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068904 4825 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068915 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068924 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068934 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068942 4825 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068951 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068961 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068970 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.068979 4825 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.069944 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e48a4135-d1b9-4dfb-89fc-be393f7937aa-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.071692 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-proxy-tls\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.076598 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.076871 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.098370 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4krj4\" (UniqueName: \"kubernetes.io/projected/3f038b04-14c9-421c-91e9-ab654b6c4ac8-kube-api-access-4krj4\") pod \"node-resolver-xvdcs\" (UID: \"3f038b04-14c9-421c-91e9-ab654b6c4ac8\") " pod="openshift-dns/node-resolver-xvdcs" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.098639 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2gzp\" (UniqueName: \"kubernetes.io/projected/44f62e96-26a6-4bfe-8e8c-6884216bd363-kube-api-access-k2gzp\") pod \"multus-zk9x9\" (UID: \"44f62e96-26a6-4bfe-8e8c-6884216bd363\") " pod="openshift-multus/multus-zk9x9" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.099565 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wjlg\" (UniqueName: 
\"kubernetes.io/projected/a57a780f-aa1f-4e0f-9a90-5e6a70f89d18-kube-api-access-2wjlg\") pod \"machine-config-daemon-b6jcs\" (UID: \"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\") " pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.103677 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfhp\" (UniqueName: \"kubernetes.io/projected/e48a4135-d1b9-4dfb-89fc-be393f7937aa-kube-api-access-xwfhp\") pod \"multus-additional-cni-plugins-6bwfw\" (UID: \"e48a4135-d1b9-4dfb-89fc-be393f7937aa\") " pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.106290 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.136179 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.146837 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.156844 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.158675 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.169489 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.170403 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.171195 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: W1007 19:00:32.176322 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48a4135_d1b9_4dfb_89fc_be393f7937aa.slice/crio-cf6ddbe303411b3eccfe8b3c0ca6e2cfb5c06dff89a307601edb943b0a5c9860 WatchSource:0}: Error finding container cf6ddbe303411b3eccfe8b3c0ca6e2cfb5c06dff89a307601edb943b0a5c9860: Status 404 returned error can't find the container with id cf6ddbe303411b3eccfe8b3c0ca6e2cfb5c06dff89a307601edb943b0a5c9860 Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.181485 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zk9x9" Oct 07 19:00:32 crc kubenswrapper[4825]: W1007 19:00:32.182384 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57a780f_aa1f_4e0f_9a90_5e6a70f89d18.slice/crio-cbc110571e60b8bd56aa4810db1dbd5d2eb4db59470fbeaea69a7a45f6e44534 WatchSource:0}: Error finding container cbc110571e60b8bd56aa4810db1dbd5d2eb4db59470fbeaea69a7a45f6e44534: Status 404 returned error can't find the container with id cbc110571e60b8bd56aa4810db1dbd5d2eb4db59470fbeaea69a7a45f6e44534 Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.186357 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.204704 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.205248 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6lvdm"] Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.206250 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: W1007 19:00:32.207467 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f62e96_26a6_4bfe_8e8c_6884216bd363.slice/crio-5315343a99827990084f7b04390314540a04c660b2924bbfa1375e5de3cddf35 WatchSource:0}: Error finding container 5315343a99827990084f7b04390314540a04c660b2924bbfa1375e5de3cddf35: Status 404 returned error can't find the container with id 5315343a99827990084f7b04390314540a04c660b2924bbfa1375e5de3cddf35 Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.208010 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.208352 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.209619 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.211803 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.212529 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.212719 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.213360 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.223148 4825 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.231179 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.241728 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.255722 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270138 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-systemd\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270174 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11546b62-cdda-449d-963e-418c2d4b6e46-ovn-node-metrics-cert\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270197 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-kubelet\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270220 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-env-overrides\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270258 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-netns\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270278 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-var-lib-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270296 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-node-log\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270311 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-bin\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270327 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-script-lib\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270344 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-ovn-kubernetes\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270397 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-netd\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270415 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-ovn\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270434 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270466 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-systemd-units\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270484 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-config\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-log-socket\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 
19:00:32.270520 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-etc-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270545 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-slash\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.270563 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmmv8\" (UniqueName: \"kubernetes.io/projected/11546b62-cdda-449d-963e-418c2d4b6e46-kube-api-access-qmmv8\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.274013 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.287487 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.305522 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.318162 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.329665 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.338888 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.351511 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.366565 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.367562 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371409 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-var-lib-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371439 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-node-log\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371461 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-bin\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371479 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-script-lib\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371498 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371514 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-ovn-kubernetes\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-netd\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371560 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-ovn\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371578 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371605 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-systemd-units\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371631 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-config\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371650 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-log-socket\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371674 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-slash\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371690 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-etc-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371709 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmmv8\" (UniqueName: \"kubernetes.io/projected/11546b62-cdda-449d-963e-418c2d4b6e46-kube-api-access-qmmv8\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc 
kubenswrapper[4825]: I1007 19:00:32.371725 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-systemd\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371741 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11546b62-cdda-449d-963e-418c2d4b6e46-ovn-node-metrics-cert\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371759 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-kubelet\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371775 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-env-overrides\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371796 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-netns\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371851 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-netns\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371884 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-var-lib-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371906 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-node-log\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.371928 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-bin\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372567 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-log-socket\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372663 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-ovn-kubernetes\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372703 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-netd\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372728 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-script-lib\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372737 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-ovn\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372768 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-slash\") pod \"ovnkube-node-6lvdm\" (UID: 
\"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372770 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372793 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-systemd-units\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.372806 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-etc-openvswitch\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.373022 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-systemd\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.373323 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-config\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.373347 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-kubelet\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.373707 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-env-overrides\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.379869 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.382549 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11546b62-cdda-449d-963e-418c2d4b6e46-ovn-node-metrics-cert\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: 
I1007 19:00:32.385407 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvdcs" Oct 07 19:00:32 crc kubenswrapper[4825]: W1007 19:00:32.390285 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-fea06221c6d40275d7b8715b4f903dc770d505d7c22b1336f63e5caca640c614 WatchSource:0}: Error finding container fea06221c6d40275d7b8715b4f903dc770d505d7c22b1336f63e5caca640c614: Status 404 returned error can't find the container with id fea06221c6d40275d7b8715b4f903dc770d505d7c22b1336f63e5caca640c614 Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.394981 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmmv8\" (UniqueName: \"kubernetes.io/projected/11546b62-cdda-449d-963e-418c2d4b6e46-kube-api-access-qmmv8\") pod \"ovnkube-node-6lvdm\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.403657 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.417243 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.428494 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.439908 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.442491 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.458917 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.472863 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.473005 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.473041 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.473138 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.473190 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-07 19:00:33.473176618 +0000 UTC m=+22.295215255 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.473322 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.473368 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:00:33.473332103 +0000 UTC m=+22.295370740 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.473419 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:33.473407215 +0000 UTC m=+22.295446082 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.496340 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.526200 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.534916 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: W1007 19:00:32.539436 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11546b62_cdda_449d_963e_418c2d4b6e46.slice/crio-a1ebad1f97a9efe415c351baad2fd4e11338df5fb533ee22b295891820bc5a21 WatchSource:0}: Error finding container a1ebad1f97a9efe415c351baad2fd4e11338df5fb533ee22b295891820bc5a21: Status 404 returned error can't find the container with id a1ebad1f97a9efe415c351baad2fd4e11338df5fb533ee22b295891820bc5a21 Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.573711 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.573797 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.573898 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.573936 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.573947 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.573970 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.573982 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.573951 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 
19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.574035 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:33.574019257 +0000 UTC m=+22.396057894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:32 crc kubenswrapper[4825]: E1007 19:00:32.574162 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:33.574145141 +0000 UTC m=+22.396183778 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.579928 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.612881 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.951550 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.954490 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.954737 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.955628 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvdcs" 
event={"ID":"3f038b04-14c9-421c-91e9-ab654b6c4ac8","Type":"ContainerStarted","Data":"e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.955659 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvdcs" event={"ID":"3f038b04-14c9-421c-91e9-ab654b6c4ac8","Type":"ContainerStarted","Data":"457eb8ce5eb72c0fa9efe9991ed9eb6cb162f361392015fd49ba4140a2ac1e72"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.956719 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fea06221c6d40275d7b8715b4f903dc770d505d7c22b1336f63e5caca640c614"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.957761 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerStarted","Data":"ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.957791 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerStarted","Data":"5315343a99827990084f7b04390314540a04c660b2924bbfa1375e5de3cddf35"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.960103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.960133 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" 
event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.960149 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"cbc110571e60b8bd56aa4810db1dbd5d2eb4db59470fbeaea69a7a45f6e44534"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.961631 4825 generic.go:334] "Generic (PLEG): container finished" podID="e48a4135-d1b9-4dfb-89fc-be393f7937aa" containerID="fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31" exitCode=0 Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.961686 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerDied","Data":"fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.961705 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerStarted","Data":"cf6ddbe303411b3eccfe8b3c0ca6e2cfb5c06dff89a307601edb943b0a5c9860"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.967582 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c" exitCode=0 Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.967642 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c"} Oct 07 19:00:32 crc 
kubenswrapper[4825]: I1007 19:00:32.967663 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"a1ebad1f97a9efe415c351baad2fd4e11338df5fb533ee22b295891820bc5a21"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.969382 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.969417 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8ac30b0d4b1705fc03b255c1b0c3c579eb8bd2f5033f000ae1ecbbd4f3caff65"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.971306 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.971332 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1"} Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.971342 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ce83556b0f4ce66178e6da82862c0c43cc1379c5496c374a6a393e65542ec737"} Oct 07 19:00:32 
crc kubenswrapper[4825]: I1007 19:00:32.974666 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.983553 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:32 crc kubenswrapper[4825]: I1007 19:00:32.995968 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.006370 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.024884 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.037300 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.048058 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.057149 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.069268 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.086250 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.098138 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.107102 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.133640 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.178547 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.221132 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.260146 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.298345 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.346855 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.388814 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.416590 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.458250 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.484217 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.484351 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:00:35.484330399 +0000 UTC m=+24.306369036 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.484405 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.484448 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.484537 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.484578 4825 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.484591 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:35.484581227 +0000 UTC m=+24.306619864 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.484616 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:35.484606797 +0000 UTC m=+24.306645434 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.495894 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"nam
e\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.538517 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.582722 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.585545 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.585641 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585799 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585848 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585861 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585812 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585893 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585908 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585926 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:35.585904692 +0000 UTC m=+24.407943329 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.585968 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:35.585949013 +0000 UTC m=+24.407987640 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.794752 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.795206 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.794804 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.794830 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.795378 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:33 crc kubenswrapper[4825]: E1007 19:00:33.795462 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.802448 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.803290 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.804541 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.805189 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.806258 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.806798 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.807469 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.808478 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.809166 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.810086 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.810618 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.811811 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.812364 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.812882 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.813924 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.814458 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.815486 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.815932 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.816500 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.818208 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.818772 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.820111 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.820611 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.821896 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.823896 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.824686 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.827996 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.828509 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.829675 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.830184 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.831285 4825 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.831393 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.832974 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.834018 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.834548 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.836183 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.836898 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.837823 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.838657 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.839768 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.840287 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.841464 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.842516 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.843126 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.843640 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.844636 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.845554 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.846281 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.846782 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.847783 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.848467 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.849579 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.850309 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.850923 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.978914 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8"} Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.978976 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2"} Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.978993 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04"} Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.979004 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab"} Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.980972 4825 generic.go:334] "Generic (PLEG): container finished" podID="e48a4135-d1b9-4dfb-89fc-be393f7937aa" containerID="62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9" exitCode=0 Oct 07 19:00:33 crc kubenswrapper[4825]: I1007 19:00:33.981100 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" 
event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerDied","Data":"62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9"} Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.007743 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.025327 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.038022 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.061900 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.078036 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.095758 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.109156 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.121663 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.147790 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.172968 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.196922 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.215478 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:34Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.990090 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b"} Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.990167 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc"} Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.993495 4825 generic.go:334] "Generic (PLEG): container finished" podID="e48a4135-d1b9-4dfb-89fc-be393f7937aa" containerID="e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312" exitCode=0 Oct 07 19:00:34 crc kubenswrapper[4825]: I1007 19:00:34.993548 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerDied","Data":"e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312"} Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.021306 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.040809 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.067439 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 
19:00:35.089733 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.112950 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.134238 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.147834 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.169904 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.191154 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.206574 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.221972 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.234195 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.509749 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.509923 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.510053 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:00:39.510004347 +0000 UTC m=+28.332042984 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.510141 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.510165 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.510313 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:39.510282626 +0000 UTC m=+28.332321293 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.510372 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.510462 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:39.510441951 +0000 UTC m=+28.332480618 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.611669 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.611734 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.611906 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.611969 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.611983 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.612048 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:39.612029195 +0000 UTC m=+28.434067832 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.612641 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.612694 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.612710 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.612811 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:39.612785379 +0000 UTC m=+28.434824016 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.794799 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.794904 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.794959 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.795097 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.795172 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:35 crc kubenswrapper[4825]: E1007 19:00:35.795321 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.803153 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.826508 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.827530 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.829189 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.846940 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.867970 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.888965 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.906464 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.925863 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.955106 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.973973 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:35 crc kubenswrapper[4825]: I1007 19:00:35.996210 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:35Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.001362 4825 generic.go:334] "Generic (PLEG): container finished" podID="e48a4135-d1b9-4dfb-89fc-be393f7937aa" 
containerID="1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b" exitCode=0 Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.001775 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerDied","Data":"1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b"} Oct 07 19:00:36 crc kubenswrapper[4825]: E1007 19:00:36.014498 4825 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.014623 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f
2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.046185 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 
19:00:36.064730 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.078161 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.091120 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.103447 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9
ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.112006 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.125299 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.137000 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.150623 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.171559 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.182813 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.196031 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.222366 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.242443 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.256156 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.850824 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.855087 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.866684 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.885610 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Com
pleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.905261 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.928774 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.950072 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.971908 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:36 crc kubenswrapper[4825]: I1007 19:00:36.988727 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:36Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.006990 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.013038 
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.016427 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.018039 4825 generic.go:334] "Generic (PLEG): container finished" podID="e48a4135-d1b9-4dfb-89fc-be393f7937aa" containerID="3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560" exitCode=0 Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.019417 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerDied","Data":"3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.037560 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.061460 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.078717 4825 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.097814 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.109117 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.120077 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.138266 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.152561 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.182020 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.197399 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.210097 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.235838 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.250732 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.265538 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.279896 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.291980 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.305825 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.317526 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.339078 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.353196 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.511462 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vtrsb"] Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.512264 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.514709 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.514735 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.514916 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.517207 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.530308 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.551890 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.567878 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.600318 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.618098 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:
13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.635645 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5c0366d9-864d-4de0-8482-9d0a061fcd6f-host\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.635722 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzvz\" (UniqueName: \"kubernetes.io/projected/5c0366d9-864d-4de0-8482-9d0a061fcd6f-kube-api-access-dgzvz\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.635796 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c0366d9-864d-4de0-8482-9d0a061fcd6f-serviceca\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.637069 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.657972 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.676087 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.699632 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.719923 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.722385 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.725321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.725388 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.725413 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.725616 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.735760 4825 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.736089 4825 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.737149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgzvz\" (UniqueName: \"kubernetes.io/projected/5c0366d9-864d-4de0-8482-9d0a061fcd6f-kube-api-access-dgzvz\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.737259 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c0366d9-864d-4de0-8482-9d0a061fcd6f-serviceca\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.737354 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c0366d9-864d-4de0-8482-9d0a061fcd6f-host\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.737435 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c0366d9-864d-4de0-8482-9d0a061fcd6f-host\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.738133 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.738203 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc 
kubenswrapper[4825]: I1007 19:00:37.738224 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.738282 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.738301 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:37Z","lastTransitionTime":"2025-10-07T19:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.738623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c0366d9-864d-4de0-8482-9d0a061fcd6f-serviceca\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.740347 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.755029 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.759614 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.759669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.759689 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.759715 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.759734 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:37Z","lastTransitionTime":"2025-10-07T19:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.772310 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.774427 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzvz\" (UniqueName: \"kubernetes.io/projected/5c0366d9-864d-4de0-8482-9d0a061fcd6f-kube-api-access-dgzvz\") pod \"node-ca-vtrsb\" (UID: \"5c0366d9-864d-4de0-8482-9d0a061fcd6f\") " pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.782965 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.787360 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.787421 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.787440 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.787465 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.787482 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:37Z","lastTransitionTime":"2025-10-07T19:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.789171 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.794875 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.794922 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.794888 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.795039 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.795169 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.795354 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.804846 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.807896 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.807936 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.807961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.807982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.807998 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:37Z","lastTransitionTime":"2025-10-07T19:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.809570 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z 
is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.819154 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.820332 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.823117 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.823162 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.823183 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 
19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.823205 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.823218 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:37Z","lastTransitionTime":"2025-10-07T19:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.828053 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vtrsb" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.841997 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:37Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:37 crc kubenswrapper[4825]: E1007 19:00:37.842205 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.844318 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.844374 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.844390 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.844419 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.844436 4825 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:37Z","lastTransitionTime":"2025-10-07T19:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:37 crc kubenswrapper[4825]: W1007 19:00:37.847137 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c0366d9_864d_4de0_8482_9d0a061fcd6f.slice/crio-30ad8f7b224ba8df75610ce1689066f3dadcb2ee83d58537a4aaf70d8cbdc175 WatchSource:0}: Error finding container 30ad8f7b224ba8df75610ce1689066f3dadcb2ee83d58537a4aaf70d8cbdc175: Status 404 returned error can't find the container with id 30ad8f7b224ba8df75610ce1689066f3dadcb2ee83d58537a4aaf70d8cbdc175 Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.947333 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.947384 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.947395 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.947412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:37 crc kubenswrapper[4825]: I1007 19:00:37.947424 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:37Z","lastTransitionTime":"2025-10-07T19:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.023360 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vtrsb" event={"ID":"5c0366d9-864d-4de0-8482-9d0a061fcd6f","Type":"ContainerStarted","Data":"30ad8f7b224ba8df75610ce1689066f3dadcb2ee83d58537a4aaf70d8cbdc175"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.027465 4825 generic.go:334] "Generic (PLEG): container finished" podID="e48a4135-d1b9-4dfb-89fc-be393f7937aa" containerID="33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a" exitCode=0 Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.027562 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerDied","Data":"33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.042835 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"containers with 
unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.051255 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.051304 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.051315 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.051336 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.051347 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.059610 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.078310 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.090734 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9
ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.102455 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.118189 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.130960 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.143811 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.154674 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.154724 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.154736 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc 
kubenswrapper[4825]: I1007 19:00:38.154757 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.154772 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.159123 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.186255 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.198939 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.210968 4825 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.228530 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.244092 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.258008 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.258474 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.258703 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.258894 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.259088 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.267060 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:38Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.362062 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.362105 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.362115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.362133 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.362147 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.465262 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.465341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.465365 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.465397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.465419 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.568257 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.568717 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.568845 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.568971 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.569062 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.671217 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.671321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.671345 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.671379 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.671398 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.774849 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.774910 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.774923 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.774946 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.774962 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.879243 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.879298 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.879307 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.879329 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.879346 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.982971 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.983057 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.983082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.983112 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:38 crc kubenswrapper[4825]: I1007 19:00:38.983134 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:38Z","lastTransitionTime":"2025-10-07T19:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.086285 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.086349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.086368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.086397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.086417 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.191270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.191729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.191741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.191761 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.191773 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.295440 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.295523 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.295544 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.295576 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.295595 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.399527 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.399619 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.399645 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.399680 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.399703 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.504505 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.504616 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.504644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.504673 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.504699 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.555680 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.555891 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.556029 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:00:47.555981995 +0000 UTC m=+36.378020662 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.556099 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.556115 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.556195 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:47.556170141 +0000 UTC m=+36.378208818 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.556302 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.556428 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:47.556394018 +0000 UTC m=+36.378432865 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.608343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.608404 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.608429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.608461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc 
kubenswrapper[4825]: I1007 19:00:39.608483 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.657913 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.657989 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658149 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658202 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658220 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658286 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658340 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:47.658317202 +0000 UTC m=+36.480355859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658342 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658376 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.658483 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:00:47.658450306 +0000 UTC m=+36.480489143 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.710984 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.711047 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.711067 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.711093 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.711113 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.795473 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.795509 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.795576 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.795697 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.795866 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:39 crc kubenswrapper[4825]: E1007 19:00:39.796166 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.814173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.814307 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.814343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.814389 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.814416 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.917563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.917618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.917638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.917666 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:39 crc kubenswrapper[4825]: I1007 19:00:39.917686 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:39Z","lastTransitionTime":"2025-10-07T19:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.020788 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.020851 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.020862 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.020882 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.020895 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.039878 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" event={"ID":"e48a4135-d1b9-4dfb-89fc-be393f7937aa","Type":"ContainerStarted","Data":"118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.046599 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.046937 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.049335 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vtrsb" event={"ID":"5c0366d9-864d-4de0-8482-9d0a061fcd6f","Type":"ContainerStarted","Data":"6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.051578 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.072731 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb
1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.088911 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.106227 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.123289 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.123344 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.123361 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.123394 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.123412 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.130559 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.137016 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c7
59e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.152475 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.174755 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.201555 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.221933 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.226428 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.226492 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc 
kubenswrapper[4825]: I1007 19:00:40.226511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.226537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.226554 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.246991 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.268810 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.298662 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.316597 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.329424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.329469 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.329480 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.329497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.329513 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.339566 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6
d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.354868 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.376771 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.396909 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.420803 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.433464 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.433827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.433863 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.433888 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.433903 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.453438 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.475278 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.491588 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.507417 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.526990 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.536999 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.537069 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.537082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.537104 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.537118 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.550510 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.569069 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.582794 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.607869 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.627138 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.640528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.640609 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.640635 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.640668 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.640691 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.646454 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.665767 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:40Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.744207 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.744312 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.744331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.744356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.744377 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.851026 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.851094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.851119 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.851150 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.851171 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.959760 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.960064 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.960159 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.960264 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:40 crc kubenswrapper[4825]: I1007 19:00:40.960349 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:40Z","lastTransitionTime":"2025-10-07T19:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.053015 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.053518 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.063760 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.063896 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.064007 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.064088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.064189 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.085617 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.104010 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.127508 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.145422 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: 
I1007 19:00:41.167007 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.167738 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.167897 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.168008 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.168138 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.168276 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.190293 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.222380 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.238147 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.261441 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.271023 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.271094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.271113 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.271671 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.271730 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.284977 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.304784 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.321860 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.341043 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.362017 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.374660 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 
19:00:41.374751 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.374780 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.374817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.374847 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.379041 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.391657 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.478064 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.478125 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.478144 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.478170 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.478187 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.580985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.581071 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.581090 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.581118 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.581135 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.684008 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.684105 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.684130 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.684165 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.684185 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.787572 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.787644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.787668 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.787697 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.787722 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.795390 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:41 crc kubenswrapper[4825]: E1007 19:00:41.795612 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.795699 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.795753 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:41 crc kubenswrapper[4825]: E1007 19:00:41.795931 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:41 crc kubenswrapper[4825]: E1007 19:00:41.796086 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.826153 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.853570 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.872442 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.890258 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.890291 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.890303 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.890320 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.890332 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.902791 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.922973 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.943596 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.973288 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.992412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.992456 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.992466 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.992519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.992531 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:41Z","lastTransitionTime":"2025-10-07T19:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:41 crc kubenswrapper[4825]: I1007 19:00:41.995052 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:41Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.016195 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:42Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.026427 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:42Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.042694 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:42Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.054342 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:42Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.057074 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.065724 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:42Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.078050 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:42Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.091285 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:42Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.095025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.095095 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.095108 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.095327 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.095353 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.198741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.198783 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.198795 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.198814 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.198825 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.301746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.301810 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.301829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.301856 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.301877 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.405852 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.405950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.405977 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.406007 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.406032 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.509323 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.509389 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.509407 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.509436 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.509454 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.611931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.611967 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.611978 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.611995 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.612010 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.715294 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.715364 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.715383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.715409 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.715425 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.819038 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.819149 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.819173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.819274 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.819301 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.921951 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.922025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.922049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.922098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:42 crc kubenswrapper[4825]: I1007 19:00:42.922140 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:42Z","lastTransitionTime":"2025-10-07T19:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.025127 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.025174 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.025191 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.025213 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.025259 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.062606 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/0.log" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.066033 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10" exitCode=1 Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.066090 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.066962 4825 scope.go:117] "RemoveContainer" containerID="d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.092824 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.109496 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.127728 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.128779 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.128832 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.128845 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.128864 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.128875 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.152342 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.170804 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.182664 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.211202 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.231921 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.231968 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.231979 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.231995 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 
19:00:43.232005 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.233008 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-poli
cy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.254155 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.273037 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.292454 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.312302 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.325907 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.334610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.334663 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.334681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.334705 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.334723 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.352192 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:42Z\\\",\\\"message\\\":\\\" 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:42.600345 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:42.600380 6063 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI1007 19:00:42.600403 6063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 19:00:42.600410 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 19:00:42.600428 6063 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:42.600445 6063 factory.go:656] Stopping watch factory\\\\nI1007 19:00:42.600449 6063 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:42.600467 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:42.600468 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 19:00:42.600476 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 19:00:42.600487 6063 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 19:00:42.600498 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 19:00:42.600506 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 19:00:42.600775 6063 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd
0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.375605 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:43Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.436660 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.436704 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.436718 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.436736 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.436749 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.539437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.539494 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.539513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.539536 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.539553 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.642535 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.642591 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.642605 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.642622 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.642637 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.745910 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.745964 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.745982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.746006 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.746024 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.795082 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.795154 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.795104 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:43 crc kubenswrapper[4825]: E1007 19:00:43.795316 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:43 crc kubenswrapper[4825]: E1007 19:00:43.795465 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:43 crc kubenswrapper[4825]: E1007 19:00:43.795568 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.848551 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.848653 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.848669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.848688 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.848700 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.951501 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.951545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.951558 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.951577 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:43 crc kubenswrapper[4825]: I1007 19:00:43.951591 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:43Z","lastTransitionTime":"2025-10-07T19:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.054070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.054142 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.054161 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.054187 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.054210 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.072666 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/0.log" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.076429 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.076623 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.095368 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.106179 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.123094 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.140986 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.157273 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.157323 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.157334 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.157354 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.157366 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.161086 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.176004 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.207399 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.232551 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.251411 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.261249 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.261309 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.261323 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc 
kubenswrapper[4825]: I1007 19:00:44.261342 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.261356 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.274178 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.290084 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.308238 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.327337 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.357335 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:42Z\\\",\\\"message\\\":\\\" 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:42.600345 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:42.600380 6063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 19:00:42.600403 6063 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1007 19:00:42.600410 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 19:00:42.600428 6063 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:42.600445 6063 factory.go:656] Stopping watch factory\\\\nI1007 19:00:42.600449 6063 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:42.600467 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:42.600468 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 19:00:42.600476 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 19:00:42.600487 6063 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 19:00:42.600498 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 19:00:42.600506 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 19:00:42.600775 6063 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.366747 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.366832 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.366857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.366890 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.366928 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.382892 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.384589 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr"] Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.385089 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.388378 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.388578 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.402378 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.424303 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.435210 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.447407 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.456036 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.471419 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.471452 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.471462 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.471475 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.471485 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.472197 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.490695 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.509035 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.514983 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxdpr\" (UniqueName: \"kubernetes.io/projected/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-kube-api-access-lxdpr\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.515064 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.515144 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.515191 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.524181 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.539047 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:42Z\\\",\\\"message\\\":\\\" 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:42.600345 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:42.600380 6063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 19:00:42.600403 6063 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1007 19:00:42.600410 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 19:00:42.600428 6063 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:42.600445 6063 factory.go:656] Stopping watch factory\\\\nI1007 19:00:42.600449 6063 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:42.600467 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:42.600468 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 19:00:42.600476 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 19:00:42.600487 6063 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 19:00:42.600498 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 19:00:42.600506 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 19:00:42.600775 6063 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.548622 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.559050 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.568259 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.574918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.574953 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.574962 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.574976 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.574988 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.576484 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.588596 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.597955 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:44Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.615612 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.615691 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.615750 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxdpr\" (UniqueName: \"kubernetes.io/projected/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-kube-api-access-lxdpr\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.615789 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.616280 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.616916 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.621822 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.630744 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxdpr\" (UniqueName: \"kubernetes.io/projected/2d90e25a-d8b6-4a4c-9948-c8ea3b38996c-kube-api-access-lxdpr\") pod \"ovnkube-control-plane-749d76644c-5c4jr\" (UID: \"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.678092 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.678136 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.678144 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.678160 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.678171 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.698507 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" Oct 07 19:00:44 crc kubenswrapper[4825]: W1007 19:00:44.719498 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d90e25a_d8b6_4a4c_9948_c8ea3b38996c.slice/crio-0d6e5a44906ee517ffd8bb4f7684ce6995193ac0998635a98fabf6abde78ac1a WatchSource:0}: Error finding container 0d6e5a44906ee517ffd8bb4f7684ce6995193ac0998635a98fabf6abde78ac1a: Status 404 returned error can't find the container with id 0d6e5a44906ee517ffd8bb4f7684ce6995193ac0998635a98fabf6abde78ac1a Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.781884 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.781955 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.781980 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.782007 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.782025 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.884163 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.884210 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.884224 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.884285 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.884298 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.986635 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.986672 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.986682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.986700 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:44 crc kubenswrapper[4825]: I1007 19:00:44.986711 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:44Z","lastTransitionTime":"2025-10-07T19:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.080720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" event={"ID":"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c","Type":"ContainerStarted","Data":"c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.080792 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" event={"ID":"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c","Type":"ContainerStarted","Data":"8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.080811 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" event={"ID":"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c","Type":"ContainerStarted","Data":"0d6e5a44906ee517ffd8bb4f7684ce6995193ac0998635a98fabf6abde78ac1a"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.082476 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/1.log" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.083034 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/0.log" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.085916 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112" exitCode=1 Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.085962 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" 
event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.086030 4825 scope.go:117] "RemoveContainer" containerID="d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.086667 4825 scope.go:117] "RemoveContainer" containerID="b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112" Oct 07 19:00:45 crc kubenswrapper[4825]: E1007 19:00:45.086858 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.088401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.088615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.088635 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.088658 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.088674 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.111852 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.128153 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.146273 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.168754 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.181239 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.191284 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.191324 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.191341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.191361 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.191373 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.194041 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.205221 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.228415 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:42Z\\\",\\\"message\\\":\\\" 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:42.600345 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:42.600380 6063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 19:00:42.600403 6063 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1007 19:00:42.600410 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 19:00:42.600428 6063 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:42.600445 6063 factory.go:656] Stopping watch factory\\\\nI1007 19:00:42.600449 6063 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:42.600467 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:42.600468 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 19:00:42.600476 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 19:00:42.600487 6063 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 19:00:42.600498 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 19:00:42.600506 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 19:00:42.600775 6063 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.240377 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.256157 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.270037 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.283305 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.294206 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.294291 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.294307 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.294328 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.294339 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.298798 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.314719 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.324738 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: 
I1007 19:00:45.337186 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":
\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.348836 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.364163 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.376632 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.389869 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.396648 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.396695 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.396711 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.396730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.396743 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.412607 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.429844 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.454964 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.471080 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:
13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.487326 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.499965 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.500039 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.500059 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.500087 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.500107 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.512136 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.535549 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.555582 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.569890 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.583631 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.602499 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.602578 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.602602 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.602633 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.602654 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.602627 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:42Z\\\",\\\"message\\\":\\\" 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:42.600345 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:42.600380 6063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 19:00:42.600403 6063 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1007 19:00:42.600410 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 19:00:42.600428 6063 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:42.600445 6063 factory.go:656] Stopping watch factory\\\\nI1007 19:00:42.600449 6063 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:42.600467 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:42.600468 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 19:00:42.600476 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 19:00:42.600487 6063 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 19:00:42.600498 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 19:00:42.600506 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 19:00:42.600775 6063 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var
/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.617664 4825 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:45Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.705293 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.705343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 
19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.705357 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.705377 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.705387 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.794451 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.794585 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.794811 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:45 crc kubenswrapper[4825]: E1007 19:00:45.794802 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:45 crc kubenswrapper[4825]: E1007 19:00:45.794925 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:45 crc kubenswrapper[4825]: E1007 19:00:45.795035 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.806910 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.806953 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.806964 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.806980 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.806992 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.910299 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.910356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.910373 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.910399 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:45 crc kubenswrapper[4825]: I1007 19:00:45.910417 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:45Z","lastTransitionTime":"2025-10-07T19:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.008918 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.012860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.012915 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.012933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.012959 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.012977 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.031493 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.050133 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.067058 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.083154 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.090795 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/1.log" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.096137 4825 scope.go:117] "RemoveContainer" containerID="b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112" Oct 07 19:00:46 crc kubenswrapper[4825]: E1007 19:00:46.096408 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.116032 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.116092 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.116108 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.116130 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.116149 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.116712 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9383a24482e5941ddd4e710bd11a236d81e75ff818fc493f6560e0120cbba10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:42Z\\\",\\\"message\\\":\\\" 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:42.600345 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:42.600380 6063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 19:00:42.600403 6063 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1007 19:00:42.600410 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 19:00:42.600428 6063 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:42.600445 6063 factory.go:656] Stopping watch factory\\\\nI1007 19:00:42.600449 6063 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:42.600467 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:42.600468 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 19:00:42.600476 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 19:00:42.600487 6063 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 19:00:42.600498 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 19:00:42.600506 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 19:00:42.600775 6063 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var
/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.132999 4825 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.151324 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.170629 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.183745 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.197669 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.218049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.218095 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.218103 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.218117 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.218127 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.223656 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.248256 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.250621 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-bvwh2"] Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.251369 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:46 crc kubenswrapper[4825]: E1007 19:00:46.251605 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.286193 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T
19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.307709 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.321692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.321773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.321802 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.321866 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.321897 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.331105 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.333780 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.333943 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bxd\" (UniqueName: \"kubernetes.io/projected/ee9b984f-baa3-429f-b929-3d61d5e204bc-kube-api-access-97bxd\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.349899 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.384361 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.404508 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.422707 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.424680 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.424755 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.424774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc 
kubenswrapper[4825]: I1007 19:00:46.424799 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.424817 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.434917 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.435036 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97bxd\" (UniqueName: \"kubernetes.io/projected/ee9b984f-baa3-429f-b929-3d61d5e204bc-kube-api-access-97bxd\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:46 crc kubenswrapper[4825]: E1007 19:00:46.435182 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:46 crc kubenswrapper[4825]: E1007 19:00:46.435320 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. 
No retries permitted until 2025-10-07 19:00:46.935293875 +0000 UTC m=+35.757332552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.444993 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.459778 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bxd\" (UniqueName: \"kubernetes.io/projected/ee9b984f-baa3-429f-b929-3d61d5e204bc-kube-api-access-97bxd\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.466646 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.485715 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.501732 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.528105 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.528195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.528216 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.528263 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.528281 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.530128 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.548684 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.570913 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43e
fd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.589137 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.605835 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.622701 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.633088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.633164 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.633181 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.633206 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.633254 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.649328 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.662857 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.675641 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc 
kubenswrapper[4825]: I1007 19:00:46.688077 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:46Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.736774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.736814 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.736826 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.736843 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.736857 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.839314 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.839360 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.839372 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.839390 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.839402 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.938898 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:46 crc kubenswrapper[4825]: E1007 19:00:46.939076 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:46 crc kubenswrapper[4825]: E1007 19:00:46.939172 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. No retries permitted until 2025-10-07 19:00:47.939147211 +0000 UTC m=+36.761185888 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.941846 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.941908 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.941925 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.941947 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:46 crc kubenswrapper[4825]: I1007 19:00:46.941962 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:46Z","lastTransitionTime":"2025-10-07T19:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.045543 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.045603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.045620 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.045646 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.045664 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.149465 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.149534 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.149552 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.149579 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.149600 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.226771 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.228112 4825 scope.go:117] "RemoveContainer" containerID="b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.228425 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.252381 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.252443 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.252459 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.252479 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.252490 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.355529 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.355576 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.355589 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.355608 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.355623 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.458654 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.458727 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.458753 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.458783 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.458801 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.562379 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.562455 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.562478 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.562508 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.562531 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.647580 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.647751 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.647866 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.647912 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.647920 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:01:03.647880307 +0000 UTC m=+52.469918974 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.647980 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.648004 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:03.64798005 +0000 UTC m=+52.470018717 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.648057 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:03.648031952 +0000 UTC m=+52.470070699 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.666160 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.666258 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.666285 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.666316 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.666343 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.748661 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.748783 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.748946 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.748996 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.749000 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.749028 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.749039 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.749051 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.749124 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:03.749103488 +0000 UTC m=+52.571142155 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.749157 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:03.74914642 +0000 UTC m=+52.571185087 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.769467 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.769526 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.769546 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.769570 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.769587 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.794807 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.794853 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.794989 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.794977 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.795040 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.795158 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.795346 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.795587 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.873278 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.873348 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.873366 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.873393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.873412 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.951458 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.951655 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: E1007 19:00:47.952042 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. No retries permitted until 2025-10-07 19:00:49.951997195 +0000 UTC m=+38.774035842 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.976131 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.976183 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.976195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.976213 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:47 crc kubenswrapper[4825]: I1007 19:00:47.976255 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:47Z","lastTransitionTime":"2025-10-07T19:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.079658 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.079698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.079706 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.079723 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.079738 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.152586 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.152643 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.152660 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.152687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.152704 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: E1007 19:00:48.169629 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:48Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.174503 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.174553 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.174570 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.174594 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.174613 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: E1007 19:00:48.193688 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:48Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.205088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.205162 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.205181 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.205206 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.205224 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: E1007 19:00:48.223022 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:48Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.229318 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.229400 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.229423 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.229456 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.229479 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: E1007 19:00:48.244759 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:48Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.250378 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.250453 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.250477 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.250510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.250534 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: E1007 19:00:48.271126 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:48Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:48 crc kubenswrapper[4825]: E1007 19:00:48.271392 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.273981 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.274049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.274073 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.274108 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.274134 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.376981 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.377083 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.377106 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.377136 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.377159 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.480406 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.480485 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.480510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.480540 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.480561 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.583751 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.583877 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.583900 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.583927 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.583946 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.687165 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.687220 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.687252 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.687270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.687281 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.790215 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.790316 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.790334 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.790361 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.790377 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.893449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.893507 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.893526 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.893549 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.893567 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.996612 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.996660 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.996677 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.996696 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:48 crc kubenswrapper[4825]: I1007 19:00:48.996709 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:48Z","lastTransitionTime":"2025-10-07T19:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.099881 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.099952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.099970 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.099996 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.100015 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.203618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.203679 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.203696 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.203725 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.203749 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.307070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.307125 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.307135 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.307161 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.307175 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.409393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.409439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.409449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.409465 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.409477 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.512555 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.512611 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.512631 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.512658 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.512675 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.615931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.616006 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.616025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.616050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.616100 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.718655 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.718691 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.718699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.718714 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.718725 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.794654 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.794724 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.794724 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:49 crc kubenswrapper[4825]: E1007 19:00:49.794841 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.794881 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:49 crc kubenswrapper[4825]: E1007 19:00:49.795055 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:00:49 crc kubenswrapper[4825]: E1007 19:00:49.795367 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:49 crc kubenswrapper[4825]: E1007 19:00:49.795519 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.823185 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.823255 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.823271 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.823290 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.823310 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.927024 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.927091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.927110 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.927139 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.927157 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:49Z","lastTransitionTime":"2025-10-07T19:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:49 crc kubenswrapper[4825]: I1007 19:00:49.971960 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:49 crc kubenswrapper[4825]: E1007 19:00:49.972205 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:49 crc kubenswrapper[4825]: E1007 19:00:49.972390 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. No retries permitted until 2025-10-07 19:00:53.972350335 +0000 UTC m=+42.794389132 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.031270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.031389 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.031414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.031446 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.031471 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.134932 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.134992 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.135009 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.135037 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.135055 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.238341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.238387 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.238410 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.238427 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.238438 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.341430 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.341490 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.341507 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.341531 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.341548 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.445139 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.445192 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.445204 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.445240 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.445255 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.547891 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.547963 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.547987 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.548019 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.548042 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.651444 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.651523 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.651541 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.651567 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.651587 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.754930 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.754990 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.755009 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.755033 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.755050 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.858457 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.858520 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.858536 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.858559 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.858599 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.962484 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.962530 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.962544 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.962562 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:50 crc kubenswrapper[4825]: I1007 19:00:50.962574 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:50Z","lastTransitionTime":"2025-10-07T19:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.065268 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.065358 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.065397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.065433 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.065456 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.168584 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.168651 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.168670 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.168699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.168721 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.271876 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.271939 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.271962 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.271986 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.272005 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.374538 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.374583 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.374593 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.374610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.374623 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.478521 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.478596 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.478615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.478642 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.478662 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.581107 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.581167 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.581180 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.581201 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.581215 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.684444 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.684502 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.684513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.684536 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.684549 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.787536 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.787601 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.787612 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.787633 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.787648 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.794977 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.795205 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.795127 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.795413 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:51 crc kubenswrapper[4825]: E1007 19:00:51.795521 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:51 crc kubenswrapper[4825]: E1007 19:00:51.795667 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:51 crc kubenswrapper[4825]: E1007 19:00:51.795888 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:51 crc kubenswrapper[4825]: E1007 19:00:51.796117 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.814400 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.836746 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a8
8df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f24
9e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.852456 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.865382 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc 
kubenswrapper[4825]: I1007 19:00:51.881536 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.893269 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.893336 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.893349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.893375 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.893389 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.920812 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.938978 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.959972 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.975715 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.994125 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43e
fd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:51Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.996006 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.996109 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.996130 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.996158 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:51 crc kubenswrapper[4825]: I1007 19:00:51.996208 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:51Z","lastTransitionTime":"2025-10-07T19:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.023176 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:52Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.045177 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:52Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.063504 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:00:52Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.095905 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:52Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.099271 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.099337 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.099356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.099382 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.099402 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.120027 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:52Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.141948 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:52Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.159714 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:52Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.201689 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.201769 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.201792 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.201823 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.201851 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.305047 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.305107 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.305120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.305139 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.305470 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.408036 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.408102 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.408118 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.408139 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.408153 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.511311 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.511357 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.511369 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.511386 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.511402 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.614379 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.614427 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.614440 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.614461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.614475 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.717501 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.717571 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.717589 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.717617 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.717639 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.824180 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.824513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.824601 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.824629 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.824649 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.927750 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.927784 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.927793 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.927807 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:52 crc kubenswrapper[4825]: I1007 19:00:52.927816 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:52Z","lastTransitionTime":"2025-10-07T19:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.031666 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.031752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.031804 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.031838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.031860 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:53Z","lastTransitionTime":"2025-10-07T19:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.134821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.134884 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.134909 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.134943 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.134970 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:53Z","lastTransitionTime":"2025-10-07T19:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.238315 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.238367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.238383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.238407 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.238425 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:53Z","lastTransitionTime":"2025-10-07T19:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.755314 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.755367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.755384 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.755410 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.755428 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:53Z","lastTransitionTime":"2025-10-07T19:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.794915 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.794973 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.795100 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 19:00:53 crc kubenswrapper[4825]: E1007 19:00:53.795304 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.795773 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 19:00:53 crc kubenswrapper[4825]: E1007 19:00:53.795959 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 19:00:53 crc kubenswrapper[4825]: E1007 19:00:53.796060 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 19:00:53 crc kubenswrapper[4825]: E1007 19:00:53.796185 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.858752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.858809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.858826 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.858849 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:00:53 crc kubenswrapper[4825]: I1007 19:00:53.858867 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:53Z","lastTransitionTime":"2025-10-07T19:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 19:00:54 crc kubenswrapper[4825]: I1007 19:00:54.022489 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2"
Oct 07 19:00:54 crc kubenswrapper[4825]: E1007 19:00:54.022775 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 07 19:00:54 crc kubenswrapper[4825]: E1007 19:00:54.022918 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. No retries permitted until 2025-10-07 19:01:02.022884948 +0000 UTC m=+50.844923625 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 07 19:00:54 crc kubenswrapper[4825]: I1007 19:00:54.064448 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:54 crc kubenswrapper[4825]: I1007 19:00:54.064511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:54 crc kubenswrapper[4825]: I1007 19:00:54.064530 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:54 crc kubenswrapper[4825]: I1007 19:00:54.064555 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:00:54 crc kubenswrapper[4825]: I1007 19:00:54.064572 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:54Z","lastTransitionTime":"2025-10-07T19:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.726276 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.726351 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.726368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.726394 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.726413 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:55Z","lastTransitionTime":"2025-10-07T19:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.794710 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.794748 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.794881 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 19:00:55 crc kubenswrapper[4825]: E1007 19:00:55.795104 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.795150 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2"
Oct 07 19:00:55 crc kubenswrapper[4825]: E1007 19:00:55.795313 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 19:00:55 crc kubenswrapper[4825]: E1007 19:00:55.795562 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc"
Oct 07 19:00:55 crc kubenswrapper[4825]: E1007 19:00:55.795812 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.829558 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.829626 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.829651 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.829685 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.829724 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:55Z","lastTransitionTime":"2025-10-07T19:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.933701 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.933789 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.933818 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.933856 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:55 crc kubenswrapper[4825]: I1007 19:00:55.933976 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:55Z","lastTransitionTime":"2025-10-07T19:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.037611 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.037702 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.037723 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.037752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.037774 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.140838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.140884 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.140892 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.140905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.140913 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.243796 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.243837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.243844 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.243860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.243869 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.345929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.346096 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.346176 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.346205 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.346252 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.449177 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.449266 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.449289 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.449359 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.449382 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.551985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.552065 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.552085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.552113 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.552131 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.655510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.655567 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.655583 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.655606 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.655623 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.758789 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.758908 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.758933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.758988 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.759015 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.862341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.862414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.862437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.862468 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.862497 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.965889 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.965990 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.966012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.966035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:56 crc kubenswrapper[4825]: I1007 19:00:56.966054 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:56Z","lastTransitionTime":"2025-10-07T19:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.068909 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.068969 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.068985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.069009 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.069027 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.172628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.172683 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.172699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.172724 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.172741 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.276470 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.276544 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.276562 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.276586 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.276605 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.379817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.379887 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.379910 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.379940 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.379968 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.483368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.483452 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.483479 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.483510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.483528 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.590152 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.590953 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.591023 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.591062 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.591082 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.693816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.693883 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.693902 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.693928 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.693946 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.795492 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.795533 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:57 crc kubenswrapper[4825]: E1007 19:00:57.795696 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.795724 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.795874 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:57 crc kubenswrapper[4825]: E1007 19:00:57.796122 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:57 crc kubenswrapper[4825]: E1007 19:00:57.796198 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:57 crc kubenswrapper[4825]: E1007 19:00:57.796407 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.797050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.797076 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.797086 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.797106 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.797117 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.899970 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.900028 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.900043 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.900062 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:57 crc kubenswrapper[4825]: I1007 19:00:57.900078 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:57Z","lastTransitionTime":"2025-10-07T19:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.003493 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.003564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.003649 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.003702 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.003725 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.106441 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.106613 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.106645 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.106677 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.106702 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.209347 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.209420 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.209438 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.209464 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.209481 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.312646 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.312708 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.312726 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.312756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.312775 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.416753 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.416818 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.416849 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.416875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.416893 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.521110 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.521179 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.521200 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.521258 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.521276 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.596632 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.596692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.596711 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.596735 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.596752 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: E1007 19:00:58.620063 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:58Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.625460 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.625510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.625525 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.625548 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.625564 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: E1007 19:00:58.642289 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:58Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.647457 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.647513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.647524 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.647544 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.647557 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: E1007 19:00:58.667398 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:58Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.672472 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.672516 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.672527 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.672545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.672557 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: E1007 19:00:58.686092 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:58Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.690961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.691022 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.691044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.691074 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.691096 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: E1007 19:00:58.711644 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:00:58Z is after 2025-08-24T17:21:41Z" Oct 07 19:00:58 crc kubenswrapper[4825]: E1007 19:00:58.711893 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.713999 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.714071 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.714089 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.714117 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.714135 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.816555 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.816598 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.816610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.816628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.816640 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.919729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.919801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.919825 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.919860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:58 crc kubenswrapper[4825]: I1007 19:00:58.919889 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:58Z","lastTransitionTime":"2025-10-07T19:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.023283 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.023347 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.023366 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.023389 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.023406 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.127923 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.127993 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.128012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.128042 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.128065 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.231779 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.231840 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.231851 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.231870 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.231881 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.335460 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.335553 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.335579 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.335628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.335658 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.438833 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.438893 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.438911 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.438936 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.438955 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.542533 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.542576 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.542588 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.542607 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.542618 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.645627 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.645693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.645710 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.645739 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.645759 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.748965 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.749046 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.749068 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.749096 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.749114 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.794970 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.795014 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.795124 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:00:59 crc kubenswrapper[4825]: E1007 19:00:59.795330 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.795387 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:00:59 crc kubenswrapper[4825]: E1007 19:00:59.795562 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:00:59 crc kubenswrapper[4825]: E1007 19:00:59.795670 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:00:59 crc kubenswrapper[4825]: E1007 19:00:59.795772 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.853134 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.853221 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.853282 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.853307 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.853324 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.957386 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.957514 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.957698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.957893 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:00:59 crc kubenswrapper[4825]: I1007 19:00:59.957985 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:00:59Z","lastTransitionTime":"2025-10-07T19:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.061687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.061776 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.061799 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.061828 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.061909 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.165372 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.165438 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.165456 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.165482 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.165499 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.268276 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.268330 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.268347 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.268368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.268384 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.371724 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.371803 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.371817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.371837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.371854 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.475500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.475595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.475614 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.475645 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.475663 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.578699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.578766 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.578786 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.578812 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.578829 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.682543 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.682597 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.682615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.682640 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.682657 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.785716 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.785790 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.785845 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.785876 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.785893 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.796290 4825 scope.go:117] "RemoveContainer" containerID="b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.888889 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.888998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.889028 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.889064 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.889090 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.993449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.993513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.993534 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.993563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:00 crc kubenswrapper[4825]: I1007 19:01:00.993583 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:00Z","lastTransitionTime":"2025-10-07T19:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.095965 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.096011 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.096044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.096063 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.096076 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.153583 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/1.log" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.158476 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.159757 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.179020 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.200738 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.200796 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.200811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc 
kubenswrapper[4825]: I1007 19:01:01.200833 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.200850 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.202151 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.236672 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.255879 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:
13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.278662 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.304187 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.304347 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.304378 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.304414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.304441 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.314057 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.337504 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.365623 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43e
fd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.383615 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.407599 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.407681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.407763 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.407786 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.407816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.407836 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.448187 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.483357 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.495869 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.508614 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc 
kubenswrapper[4825]: I1007 19:01:01.509643 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.509668 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.509682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.509700 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.509711 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.524235 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.552359 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.565158 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.611979 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.612035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.612049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc 
kubenswrapper[4825]: I1007 19:01:01.612067 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.612080 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.714680 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.714723 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.714735 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.714752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.714765 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.794842 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.794954 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.795010 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:01 crc kubenswrapper[4825]: E1007 19:01:01.795076 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.795113 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:01 crc kubenswrapper[4825]: E1007 19:01:01.795195 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:01 crc kubenswrapper[4825]: E1007 19:01:01.795304 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:01 crc kubenswrapper[4825]: E1007 19:01:01.795374 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.809150 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.816795 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.816868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.816881 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc 
kubenswrapper[4825]: I1007 19:01:01.816903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.816914 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.821133 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.843107 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.857340 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:
13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.879008 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.892773 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.909944 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43e
fd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.919432 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.919479 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.919493 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.919513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.919527 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:01Z","lastTransitionTime":"2025-10-07T19:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.924742 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.940173 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.953915 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.967299 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.980126 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.988797 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:01 crc kubenswrapper[4825]: I1007 19:01:01.996184 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.007594 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.016761 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: 
I1007 19:01:02.023012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.023053 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.023063 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.023079 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.023090 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.027441 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc 
kubenswrapper[4825]: I1007 19:01:02.111907 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:02 crc kubenswrapper[4825]: E1007 19:01:02.112134 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:01:02 crc kubenswrapper[4825]: E1007 19:01:02.112527 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. No retries permitted until 2025-10-07 19:01:18.112500937 +0000 UTC m=+66.934539614 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.125650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.125718 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.125742 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.125777 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.125799 4825 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.166008 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/2.log" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.167092 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/1.log" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.171821 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f" exitCode=1 Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.172012 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.172177 4825 scope.go:117] "RemoveContainer" containerID="b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.173589 4825 scope.go:117] "RemoveContainer" containerID="80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f" Oct 07 19:01:02 crc kubenswrapper[4825]: E1007 19:01:02.173972 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.193990 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.215317 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.229273 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.229355 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.229380 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.229412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.229434 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.237595 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.255000 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc 
kubenswrapper[4825]: I1007 19:01:02.271145 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.295495 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.313837 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.335844 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.338379 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.338639 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.338804 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc 
kubenswrapper[4825]: I1007 19:01:02.339012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.339266 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.350937 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07
T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.374126 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\
\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initC
ontainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.386918 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.400294 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.416676 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.442047 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.442103 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.442120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.442159 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.442201 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.444971 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-di
r\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: 
I1007 19:01:02.459443 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.473685 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.488748 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.545633 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.545681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.545702 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.545732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.545755 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.648685 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.648772 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.648794 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.648821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.648840 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.671855 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.686963 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.692838 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.712553 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.734206 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.752478 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.752529 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.752547 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.752570 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.752588 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.773851 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.795138 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.811481 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.844376 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4c9c4fdf4419280e9c033097e4536865098594e3b8d25fd0918b45a8b436112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"message\\\":\\\"36 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 19:00:44.208422 6236 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.208589 6236 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.208920 6236 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209129 6236 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 19:00:44.209218 6236 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209484 6236 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 19:00:44.209738 6236 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 19:00:44.209752 6236 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 19:00:44.209787 6236 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 19:00:44.209841 6236 factory.go:656] Stopping watch factory\\\\nI1007 19:00:44.209857 6236 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-di
r\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: 
I1007 19:01:02.855633 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.855693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.855718 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.855748 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.855771 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.862483 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.884627 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T
19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1
5e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.905423 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.923518 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.947200 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.959349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.959439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.959465 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.959498 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.959523 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:02Z","lastTransitionTime":"2025-10-07T19:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.974300 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:
00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:02 crc kubenswrapper[4825]: I1007 19:01:02.993488 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T19:01:02Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.012392 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc 
kubenswrapper[4825]: I1007 19:01:03.029050 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.052729 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c
8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.063461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.063528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.063557 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.063588 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.063612 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.167149 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.167692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.167885 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.168050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.168208 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.178769 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/2.log" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.185569 4825 scope.go:117] "RemoveContainer" containerID="80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.185848 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.206329 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.231269 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.251849 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.267302 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.272080 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.272132 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.272151 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.272178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.272196 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.291582 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.309832 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.326086 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc 
kubenswrapper[4825]: I1007 19:01:03.361773 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.375585 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.375629 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.375645 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.375668 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.375687 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.382896 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0
625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.401914 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f30cd95-eb57-436d-bb25-5d14cc087820\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.420369 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.444664 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.462089 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.476938 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.478333 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.478379 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc 
kubenswrapper[4825]: I1007 19:01:03.478397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.478421 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.478440 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.496785 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.513924 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.544553 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.562590 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:03Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.582423 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.582487 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.582509 4825 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.582535 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.582555 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.685642 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.685758 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.685778 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.685801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.685817 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.731433 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.731605 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.731682 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:01:35.731649988 +0000 UTC m=+84.553688635 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.731763 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.731791 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.731873 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:35.731849395 +0000 UTC m=+84.553888252 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.731951 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.732066 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:35.732035661 +0000 UTC m=+84.554074498 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.789361 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.789420 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.789441 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.789461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc 
kubenswrapper[4825]: I1007 19:01:03.789475 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.795382 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.795412 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.795498 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.795699 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.795694 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.795886 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.796050 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.796199 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.833039 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.833126 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833392 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833431 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833457 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833533 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:35.833510929 +0000 UTC m=+84.655549596 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833746 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833818 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833836 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:03 crc kubenswrapper[4825]: E1007 19:01:03.833928 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:01:35.833906482 +0000 UTC m=+84.655945129 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.893403 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.893494 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.893508 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.893532 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.893547 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.997151 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.997280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.997306 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.997342 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:03 crc kubenswrapper[4825]: I1007 19:01:03.997366 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:03Z","lastTransitionTime":"2025-10-07T19:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.101025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.101090 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.101107 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.101131 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.101149 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.204344 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.204419 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.204437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.204461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.204479 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.307799 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.307859 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.307877 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.307903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.307921 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.411314 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.411378 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.411389 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.411410 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.411424 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.514222 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.514303 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.514320 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.514341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.514354 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.617817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.617876 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.617889 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.617909 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.617923 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.728360 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.728429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.728449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.728475 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.728495 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.832367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.832435 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.832452 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.832476 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.832497 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.935649 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.935803 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.935824 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.935851 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:04 crc kubenswrapper[4825]: I1007 19:01:04.935867 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:04Z","lastTransitionTime":"2025-10-07T19:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.038174 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.038224 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.038254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.038270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.038282 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.141359 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.141437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.141451 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.141471 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.141483 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.244636 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.244702 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.244721 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.244751 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.244771 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.347732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.347798 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.347815 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.347836 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.347851 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.450660 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.450712 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.450726 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.450744 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.450757 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.553942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.553979 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.553987 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.554001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.554012 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.656677 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.656746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.656765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.656787 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.656803 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.759430 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.759501 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.759519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.759546 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.759565 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.794990 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.795056 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.795093 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.795023 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:05 crc kubenswrapper[4825]: E1007 19:01:05.795371 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:05 crc kubenswrapper[4825]: E1007 19:01:05.795200 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:05 crc kubenswrapper[4825]: E1007 19:01:05.795571 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:05 crc kubenswrapper[4825]: E1007 19:01:05.795694 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.862172 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.862217 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.862244 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.862263 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.862277 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.965521 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.965596 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.965615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.965644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:05 crc kubenswrapper[4825]: I1007 19:01:05.965664 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:05Z","lastTransitionTime":"2025-10-07T19:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.067674 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.067718 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.067730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.067745 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.067759 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.170736 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.170799 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.170821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.170848 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.170865 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.273709 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.273784 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.273809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.273857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.273881 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.376960 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.377010 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.377021 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.377038 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.377051 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.479782 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.479836 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.479847 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.479860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.479871 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.583364 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.583431 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.583448 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.583473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.583491 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.686733 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.686816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.686837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.686864 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.686882 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.790776 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.790875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.790903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.790933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.790954 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.894197 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.894393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.894427 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.894460 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.894481 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.997934 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.998005 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.998052 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.998082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:06 crc kubenswrapper[4825]: I1007 19:01:06.998104 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:06Z","lastTransitionTime":"2025-10-07T19:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.104746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.104806 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.104825 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.104850 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.104868 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.206573 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.206656 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.206672 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.206699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.206716 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.309746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.309814 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.309826 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.309848 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.309862 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.413085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.413149 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.413160 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.413178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.413188 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.515582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.515630 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.515640 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.515659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.515671 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.618360 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.618418 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.618428 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.618448 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.618460 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.721687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.721779 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.721802 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.721835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.721858 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.795160 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.795324 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.795394 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:07 crc kubenswrapper[4825]: E1007 19:01:07.795476 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:07 crc kubenswrapper[4825]: E1007 19:01:07.795617 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:07 crc kubenswrapper[4825]: E1007 19:01:07.795728 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.795766 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:07 crc kubenswrapper[4825]: E1007 19:01:07.795888 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.825630 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.825679 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.825690 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.825709 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.825725 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.929065 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.929173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.929201 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.929280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:07 crc kubenswrapper[4825]: I1007 19:01:07.929309 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:07Z","lastTransitionTime":"2025-10-07T19:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.032467 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.032521 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.032555 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.032573 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.032585 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.136628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.136701 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.136720 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.136746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.136765 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.238773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.238835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.238846 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.238860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.238870 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.342868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.342938 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.342956 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.342984 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.343003 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.446309 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.446377 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.446394 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.446422 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.446441 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.549272 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.549368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.549388 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.549415 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.549435 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.651989 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.652058 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.652076 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.652101 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.652120 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.756045 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.756117 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.756141 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.756204 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.756272 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.845331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.845393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.845409 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.845430 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.845447 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: E1007 19:01:08.859066 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:08Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.863042 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.863090 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.863105 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.863122 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.863133 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: E1007 19:01:08.881224 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:08Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.886103 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.886145 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.886157 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.886177 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.886190 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: E1007 19:01:08.903312 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:08Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.911164 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.911287 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.911332 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.911386 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.911433 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: E1007 19:01:08.928697 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:08Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.933664 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.933732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.933751 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.933838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.933940 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:08 crc kubenswrapper[4825]: E1007 19:01:08.950951 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:08Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:08 crc kubenswrapper[4825]: E1007 19:01:08.951593 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.955704 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.955787 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.955808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.955836 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:08 crc kubenswrapper[4825]: I1007 19:01:08.955856 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:08Z","lastTransitionTime":"2025-10-07T19:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.059524 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.059579 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.059596 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.059622 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.059639 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.162860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.162919 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.162938 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.162964 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.162982 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.266321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.266390 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.266407 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.266429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.266446 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.369301 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.369395 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.369414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.369440 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.369456 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.472619 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.472711 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.472730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.472760 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.472778 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.575526 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.575628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.575652 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.575684 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.575702 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.679174 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.679275 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.679295 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.679327 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.679345 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.783261 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.783321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.783331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.783353 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.783363 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.795029 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.795120 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.795141 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.795024 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:09 crc kubenswrapper[4825]: E1007 19:01:09.795395 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:09 crc kubenswrapper[4825]: E1007 19:01:09.795514 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:09 crc kubenswrapper[4825]: E1007 19:01:09.795586 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:09 crc kubenswrapper[4825]: E1007 19:01:09.795702 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.886498 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.886547 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.886557 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.886574 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.886586 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.990050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.990130 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.990149 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.990173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:09 crc kubenswrapper[4825]: I1007 19:01:09.990193 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:09Z","lastTransitionTime":"2025-10-07T19:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.093187 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.093304 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.093321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.093344 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.093372 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.196545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.196709 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.196741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.196925 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.196968 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.299327 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.299363 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.299374 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.299389 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.299400 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.402518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.402575 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.402592 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.402615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.402632 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.505216 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.505306 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.505325 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.505349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.505368 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.608065 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.608144 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.608162 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.608185 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.608202 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.711082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.711144 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.711168 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.711195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.711216 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.814916 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.814986 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.815004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.815027 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.815045 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.918302 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.918361 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.918382 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.918415 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:10 crc kubenswrapper[4825]: I1007 19:01:10.918440 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:10Z","lastTransitionTime":"2025-10-07T19:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.021626 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.021706 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.021728 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.021755 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.021774 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.125033 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.125093 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.125115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.125143 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.125163 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.228159 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.228284 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.228311 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.228343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.228367 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.330986 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.331038 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.331060 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.331088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.331109 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.434027 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.434096 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.434113 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.434141 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.434165 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.537323 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.537399 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.537438 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.537469 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.537493 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.640192 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.640297 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.640320 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.640343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.640360 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.743078 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.743139 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.743161 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.743182 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.743198 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.794495 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.794520 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.794566 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.794646 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:11 crc kubenswrapper[4825]: E1007 19:01:11.794638 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:11 crc kubenswrapper[4825]: E1007 19:01:11.794771 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:11 crc kubenswrapper[4825]: E1007 19:01:11.794824 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:11 crc kubenswrapper[4825]: E1007 19:01:11.794923 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.814985 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.832445 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f30cd95-eb57-436d-bb25-5d14cc087820\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.846125 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.846210 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.846270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.846298 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.846318 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.848570 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.867293 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.894055 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.911470 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.932762 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.949664 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.949773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.949801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.949948 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.949991 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:11Z","lastTransitionTime":"2025-10-07T19:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.956223 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.973985 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:11 crc kubenswrapper[4825]: I1007 19:01:11.992103 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43e
fd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:11Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.006811 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.019460 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.033659 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.049452 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.053577 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc 
kubenswrapper[4825]: I1007 19:01:12.053622 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.053635 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.053653 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.053664 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.063572 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.075065 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc 
kubenswrapper[4825]: I1007 19:01:12.086876 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.102988 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c
8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:12Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.155625 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.155661 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.155669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.155683 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.155695 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.259088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.259132 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.259146 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.259171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.259183 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.362336 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.362625 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.362787 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.362939 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.363070 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.466812 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.466884 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.466906 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.466933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.466952 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.570642 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.570709 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.570723 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.570744 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.570757 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.674180 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.674248 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.674277 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.674301 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.674316 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.778068 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.778143 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.778158 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.778183 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.778198 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.880930 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.881018 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.881032 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.881055 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.881068 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.984681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.984756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.984769 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.984793 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:12 crc kubenswrapper[4825]: I1007 19:01:12.984810 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:12Z","lastTransitionTime":"2025-10-07T19:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.088032 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.088098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.088110 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.088132 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.088151 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.191866 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.191936 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.191954 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.191983 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.192004 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.294997 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.295178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.295208 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.295277 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.295315 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.398920 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.398988 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.399006 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.399033 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.399052 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.502400 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.502466 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.502486 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.502518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.502537 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.606059 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.606137 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.606165 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.606200 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.606272 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.710058 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.710109 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.710126 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.710149 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.710165 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.794981 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.795040 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.795101 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.795128 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:13 crc kubenswrapper[4825]: E1007 19:01:13.795341 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:13 crc kubenswrapper[4825]: E1007 19:01:13.795485 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:13 crc kubenswrapper[4825]: E1007 19:01:13.795632 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:13 crc kubenswrapper[4825]: E1007 19:01:13.795762 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.812644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.812688 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.812698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.812713 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.812724 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.915878 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.915942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.915961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.915988 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:13 crc kubenswrapper[4825]: I1007 19:01:13.916006 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:13Z","lastTransitionTime":"2025-10-07T19:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.018802 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.018866 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.018885 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.018909 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.018926 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.122127 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.122192 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.122210 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.122263 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.122289 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.225642 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.225726 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.225752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.225779 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.225797 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.328922 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.328984 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.329001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.329026 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.329049 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.432266 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.432333 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.432351 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.432381 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.432400 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.536071 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.536127 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.536141 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.536162 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.536184 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.639132 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.639172 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.639187 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.639205 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.639218 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.741681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.741730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.741740 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.741762 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.741774 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.847382 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.847472 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.847488 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.847508 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.847532 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.950413 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.950483 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.950497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.950515 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:14 crc kubenswrapper[4825]: I1007 19:01:14.950528 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:14Z","lastTransitionTime":"2025-10-07T19:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.053258 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.053332 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.053351 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.053376 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.053395 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.194123 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.194161 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.194171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.194186 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.194195 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.296818 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.296867 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.296882 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.296900 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.296914 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.399655 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.399700 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.399740 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.399758 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.399771 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.502681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.502721 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.502731 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.502746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.502756 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.605338 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.605370 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.605378 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.605407 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.605418 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.708632 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.708717 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.708752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.708785 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.708810 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.794671 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.794713 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.794804 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2"
Oct 07 19:01:15 crc kubenswrapper[4825]: E1007 19:01:15.794803 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 19:01:15 crc kubenswrapper[4825]: E1007 19:01:15.794917 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc"
Oct 07 19:01:15 crc kubenswrapper[4825]: E1007 19:01:15.795001 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.795061 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 19:01:15 crc kubenswrapper[4825]: E1007 19:01:15.795119 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.811582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.811618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.811627 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.811642 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.811651 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.914129 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.914196 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.914213 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.914263 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:15 crc kubenswrapper[4825]: I1007 19:01:15.914279 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:15Z","lastTransitionTime":"2025-10-07T19:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.017492 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.017569 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.017596 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.017626 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.017651 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.121322 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.121383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.121403 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.121431 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.121450 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.224871 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.225004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.225078 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.225143 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.225213 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.328331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.328389 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.328405 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.328427 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.328442 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.430281 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.430344 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.430361 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.430383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.430400 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.532808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.532847 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.532859 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.532875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.532886 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.635399 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.635447 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.635461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.635481 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.635500 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.738011 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.738059 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.738073 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.738089 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.738101 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.840982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.841050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.841077 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.841103 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.841126 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.944677 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.944739 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.944756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.944780 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:16 crc kubenswrapper[4825]: I1007 19:01:16.944796 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:16Z","lastTransitionTime":"2025-10-07T19:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.047072 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.047125 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.047142 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.047167 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.047183 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.150862 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.150905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.150916 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.150931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.150942 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.253158 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.253843 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.253863 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.253880 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.253893 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.357139 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.357518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.357665 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.357811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.357951 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.460951 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.460998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.461013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.461034 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.461046 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.563199 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.563257 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.563268 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.563284 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.563295 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.667190 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.667274 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.667291 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.667317 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.667334 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.770726 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.770785 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.770804 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.770831 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.770850 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.795191 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.795300 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.795191 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2"
Oct 07 19:01:17 crc kubenswrapper[4825]: E1007 19:01:17.795452 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 19:01:17 crc kubenswrapper[4825]: E1007 19:01:17.795560 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc"
Oct 07 19:01:17 crc kubenswrapper[4825]: E1007 19:01:17.795720 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.795821 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 19:01:17 crc kubenswrapper[4825]: E1007 19:01:17.796044 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.873850 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.873913 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.873937 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.873968 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.873993 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.977010 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.977121 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.977148 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.977177 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:17 crc kubenswrapper[4825]: I1007 19:01:17.977200 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:17Z","lastTransitionTime":"2025-10-07T19:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.079835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.079879 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.079894 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.079910 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.079923 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.123806 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2"
Oct 07 19:01:18 crc kubenswrapper[4825]: E1007 19:01:18.124004 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 07 19:01:18 crc kubenswrapper[4825]: E1007 19:01:18.124061 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. No retries permitted until 2025-10-07 19:01:50.124042888 +0000 UTC m=+98.946081535 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.183091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.183129 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.183141 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.183157 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.183168 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.284965 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.285272 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.285288 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.285303 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.285313 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.388101 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.388192 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.388208 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.388252 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.388266 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.490816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.490872 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.490884 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.490901 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.490914 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.593450 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.593496 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.593507 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.593522 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.593532 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.695929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.695978 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.695987 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.696001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.696011 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.795424 4825 scope.go:117] "RemoveContainer" containerID="80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f" Oct 07 19:01:18 crc kubenswrapper[4825]: E1007 19:01:18.795658 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.798450 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.798500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.798512 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.798530 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.798543 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.901076 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.901108 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.901116 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.901133 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:18 crc kubenswrapper[4825]: I1007 19:01:18.901141 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:18Z","lastTransitionTime":"2025-10-07T19:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.004791 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.004859 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.004875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.004899 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.004916 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.107632 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.107699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.107722 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.107773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.107797 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.167126 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.167201 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.167213 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.167249 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.167268 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.180923 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.184884 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.184917 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.184927 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.184942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.184953 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.196544 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.201441 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.201492 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.201509 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.201532 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.201553 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.222520 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.227491 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.227544 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.227559 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.227582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.227597 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.249172 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.254010 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.254049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.254062 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.254082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.254096 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.267164 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:19Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.267314 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.269168 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.269213 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.269236 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.269254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.269265 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.372860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.372926 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.372953 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.372985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.373004 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.476035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.476091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.476103 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.476124 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.476138 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.578896 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.578939 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.578947 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.578962 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.578973 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.682275 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.682325 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.682344 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.682368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.682387 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.785362 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.785408 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.785418 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.785435 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.785445 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.794819 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.794832 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.794890 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.794982 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.795071 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.795160 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.795409 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:19 crc kubenswrapper[4825]: E1007 19:01:19.795474 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.888093 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.888133 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.888141 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.888155 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.888163 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.990510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.990565 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.990580 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.990600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:19 crc kubenswrapper[4825]: I1007 19:01:19.990613 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:19Z","lastTransitionTime":"2025-10-07T19:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.093564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.093650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.093683 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.093719 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.093741 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.196887 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.196939 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.196950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.196974 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.196986 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.243476 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/0.log" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.243586 4825 generic.go:334] "Generic (PLEG): container finished" podID="44f62e96-26a6-4bfe-8e8c-6884216bd363" containerID="ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71" exitCode=1 Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.243648 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerDied","Data":"ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.244437 4825 scope.go:117] "RemoveContainer" containerID="ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.257606 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.270935 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.281501 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.297704 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.299636 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.299686 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.299699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.299719 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.299732 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.308732 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.324062 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.339632 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"2025-10-07T19:00:33+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853\\\\n2025-10-07T19:00:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853 to /host/opt/cni/bin/\\\\n2025-10-07T19:00:34Z [verbose] multus-daemon started\\\\n2025-10-07T19:00:34Z [verbose] Readiness Indicator file check\\\\n2025-10-07T19:01:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.352512 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.367958 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.383973 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.398870 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: 
I1007 19:01:20.403505 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.403537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.403548 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.403566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.403581 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.414565 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc 
kubenswrapper[4825]: I1007 19:01:20.430276 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.452094 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.467817 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.506052 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.506117 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.506135 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.506159 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.506175 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.519981 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f30cd95-eb57-436d-bb25-5d14cc087820\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.548988 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.568942 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:20Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.608489 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.608533 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.608543 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.608563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.608576 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.711799 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.711872 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.711883 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.711907 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.711923 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.815161 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.815221 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.815246 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.815295 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.815314 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.918280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.918325 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.918337 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.918356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:20 crc kubenswrapper[4825]: I1007 19:01:20.918369 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:20Z","lastTransitionTime":"2025-10-07T19:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.020540 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.020621 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.020639 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.020669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.020684 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.124319 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.124384 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.124402 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.124431 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.124447 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.227768 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.227846 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.227859 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.227880 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.227898 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.249007 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/0.log" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.249085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerStarted","Data":"58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.264568 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.278909 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.298808 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.320206 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.330607 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.330664 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.330681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.330703 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.330716 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.338078 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.353293 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.374324 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"2025-10-07T19:00:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853\\\\n2025-10-07T19:00:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853 to /host/opt/cni/bin/\\\\n2025-10-07T19:00:34Z [verbose] multus-daemon started\\\\n2025-10-07T19:00:34Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T19:01:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.386463 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb18
42a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.403686 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.418545 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.430877 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: 
I1007 19:01:21.432681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.432725 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.432736 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.432754 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.432766 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.441157 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc 
kubenswrapper[4825]: I1007 19:01:21.455454 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.479281 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.493141 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.509687 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f30cd95-eb57-436d-bb25-5d14cc087820\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.523697 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.535303 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.535347 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.535357 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc 
kubenswrapper[4825]: I1007 19:01:21.535375 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.535388 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.538019 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.637437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.637630 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.637650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.637666 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.637676 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.740369 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.740404 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.740414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.740430 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.740441 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.795318 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:21 crc kubenswrapper[4825]: E1007 19:01:21.795790 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.795441 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:21 crc kubenswrapper[4825]: E1007 19:01:21.796038 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.795331 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:21 crc kubenswrapper[4825]: E1007 19:01:21.796246 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.795445 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:21 crc kubenswrapper[4825]: E1007 19:01:21.796435 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.805832 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.820066 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.837791 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"2025-10-07T19:00:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853\\\\n2025-10-07T19:00:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853 to /host/opt/cni/bin/\\\\n2025-10-07T19:00:34Z [verbose] multus-daemon started\\\\n2025-10-07T19:00:34Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T19:01:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.842688 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.842741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.842758 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.842783 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.842802 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.851816 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.864222 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc 
kubenswrapper[4825]: I1007 19:01:21.881730 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.900563 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c
8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.916309 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.928141 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f30cd95-eb57-436d-bb25-5d14cc087820\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.943050 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.945123 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.945251 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.945326 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:21 crc 
kubenswrapper[4825]: I1007 19:01:21.945390 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.945454 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:21Z","lastTransitionTime":"2025-10-07T19:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.960507 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.983492 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:21 crc kubenswrapper[4825]: I1007 19:01:21.999532 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:21Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.015807 4825 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.045373 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.048414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.048449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.048459 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.048473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.048485 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.060605 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.078710 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.090805 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:22Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.150809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.150853 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc 
kubenswrapper[4825]: I1007 19:01:22.150864 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.150881 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.150893 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.252947 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.252985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.252998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.253015 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.253025 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.355528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.355575 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.355583 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.355600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.355610 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.458575 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.458633 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.458644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.458661 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.458672 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.562525 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.562583 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.562603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.562631 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.562652 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.665676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.665731 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.665744 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.665764 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.665777 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.768711 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.768762 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.768781 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.768804 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.768823 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.871493 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.871551 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.871564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.871583 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.871597 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.974777 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.974837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.974854 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.974879 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:22 crc kubenswrapper[4825]: I1007 19:01:22.974896 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:22Z","lastTransitionTime":"2025-10-07T19:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.078320 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.078380 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.078398 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.078425 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.078443 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.181256 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.181317 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.181337 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.181363 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.181380 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.284196 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.284286 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.284304 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.284328 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.284349 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.386600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.386650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.386662 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.386679 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.386693 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.489874 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.489947 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.489974 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.490004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.490027 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.592467 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.592539 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.592563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.592588 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.592604 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.694982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.695019 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.695031 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.695045 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.695057 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.794950 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.795000 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:23 crc kubenswrapper[4825]: E1007 19:01:23.795055 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:23 crc kubenswrapper[4825]: E1007 19:01:23.795135 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.795211 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:23 crc kubenswrapper[4825]: E1007 19:01:23.795278 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.795366 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:23 crc kubenswrapper[4825]: E1007 19:01:23.795419 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.796793 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.796868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.796891 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.796918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.796936 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.900137 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.900326 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.900356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.900380 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:23 crc kubenswrapper[4825]: I1007 19:01:23.900397 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:23Z","lastTransitionTime":"2025-10-07T19:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.002624 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.002663 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.002671 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.002685 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.002695 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.104650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.104688 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.104697 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.104713 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.104722 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.206767 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.206811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.206823 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.206840 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.206852 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.309304 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.309378 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.309397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.309423 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.309443 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.412202 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.412299 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.412322 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.412352 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.412373 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.515335 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.515406 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.515425 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.515453 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.515476 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.619126 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.619170 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.619179 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.619195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.619204 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.721961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.722037 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.722064 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.722094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.722117 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.825530 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.825588 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.825604 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.825626 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.825641 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.928387 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.928472 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.928491 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.928519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:24 crc kubenswrapper[4825]: I1007 19:01:24.928538 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:24Z","lastTransitionTime":"2025-10-07T19:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.031570 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.031638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.031656 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.031681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.031697 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.134796 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.134863 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.134881 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.134907 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.134930 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.237277 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.237319 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.237330 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.237349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.237361 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.340069 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.340124 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.340145 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.340170 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.340189 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.443684 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.443747 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.443766 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.443798 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.443817 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.546333 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.546401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.546427 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.546452 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.546469 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.654492 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.654580 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.654618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.654647 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.654667 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.758680 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.758757 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.758770 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.758795 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.758809 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.794692 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:25 crc kubenswrapper[4825]: E1007 19:01:25.794888 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.794916 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.794945 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.794959 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:25 crc kubenswrapper[4825]: E1007 19:01:25.795085 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:25 crc kubenswrapper[4825]: E1007 19:01:25.795313 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:25 crc kubenswrapper[4825]: E1007 19:01:25.795504 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.861693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.861755 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.861795 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.861831 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.861854 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.964292 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.964365 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.964401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.964430 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:25 crc kubenswrapper[4825]: I1007 19:01:25.964450 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:25Z","lastTransitionTime":"2025-10-07T19:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.067321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.067385 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.067414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.067437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.067455 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.170356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.170424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.170444 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.170472 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.170494 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.273209 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.273271 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.273296 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.273315 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.273329 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.376704 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.376771 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.376789 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.376813 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.376830 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.480458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.480542 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.480573 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.480607 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.480632 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.583987 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.584039 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.584056 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.584079 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.584100 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.687476 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.687527 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.687543 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.687565 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.687587 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.790194 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.790329 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.790349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.790380 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.790399 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.893610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.893678 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.893714 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.893740 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.893759 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.997522 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.997669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.997695 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.997730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:26 crc kubenswrapper[4825]: I1007 19:01:26.997752 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:26Z","lastTransitionTime":"2025-10-07T19:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.100548 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.100619 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.100641 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.100676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.100699 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.203222 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.203324 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.203350 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.203381 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.203405 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.305619 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.305695 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.305730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.305761 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.305784 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.408682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.408743 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.408762 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.408786 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.408804 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.512498 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.512575 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.512601 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.512634 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.512658 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.615899 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.615973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.615995 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.616024 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.616044 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.719638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.719696 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.719706 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.719723 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.719734 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.795006 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.795179 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.795216 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:27 crc kubenswrapper[4825]: E1007 19:01:27.795211 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.795268 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:27 crc kubenswrapper[4825]: E1007 19:01:27.795522 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:27 crc kubenswrapper[4825]: E1007 19:01:27.795643 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:27 crc kubenswrapper[4825]: E1007 19:01:27.795740 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.822428 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.822487 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.822503 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.822523 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.822537 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.926536 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.926628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.926650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.926681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:27 crc kubenswrapper[4825]: I1007 19:01:27.926714 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:27Z","lastTransitionTime":"2025-10-07T19:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.029551 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.029600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.029656 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.029681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.029698 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.133754 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.133811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.133820 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.133837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.133847 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.236999 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.237055 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.237066 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.237082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.237093 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.340098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.340158 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.340173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.340194 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.340213 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.442444 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.442497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.442509 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.442526 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.442537 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.546371 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.546416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.546428 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.546456 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.546467 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.649501 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.649586 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.649605 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.649631 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.649651 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.753335 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.753398 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.753416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.753441 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.753459 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.856201 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.856287 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.856305 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.856334 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.856356 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.958800 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.958859 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.958876 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.958899 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:28 crc kubenswrapper[4825]: I1007 19:01:28.958916 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:28Z","lastTransitionTime":"2025-10-07T19:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.062550 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.062614 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.062630 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.062659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.062677 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.165869 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.165908 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.165918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.165936 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.165948 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.268511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.268564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.268577 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.268597 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.268611 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.371953 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.372028 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.372053 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.372085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.372105 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.475375 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.475443 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.475455 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.475474 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.475489 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.486133 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.486198 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.486221 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.486280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.486309 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.502803 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.509895 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.509955 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.509972 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.509997 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.510012 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.528997 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.534012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.534060 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.534072 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.534090 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.534104 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.548909 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.553817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.553886 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.553896 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.553919 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.553932 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.571152 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.576054 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.576117 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.576127 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.576149 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.576178 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.598358 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"951f58e0-4df3-42e3-a827-d82d183370bf\\\",\\\"systemUUID\\\":\\\"da8b2757-4bf3-4b55-84bb-69d70219b543\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:29Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.598502 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.600979 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.601033 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.601051 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.601077 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.601093 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.704523 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.704600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.704611 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.704632 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.704663 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.794784 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.795094 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.795164 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.795175 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.795191 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.795513 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.795546 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:29 crc kubenswrapper[4825]: E1007 19:01:29.795749 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.808025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.808070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.808081 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.808098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.808110 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.811878 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.910572 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.910622 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.910631 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.910647 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:29 crc kubenswrapper[4825]: I1007 19:01:29.910663 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:29Z","lastTransitionTime":"2025-10-07T19:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.014031 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.014108 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.014132 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.014163 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.014185 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.117146 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.117217 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.117538 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.117566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.117870 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.221559 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.221622 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.221638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.221659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.221677 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.324202 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.324296 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.324316 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.324340 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.324377 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.427373 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.427421 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.427432 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.427451 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.427463 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.530248 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.530300 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.530312 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.530330 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.530343 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.633976 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.634044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.634062 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.634088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.634106 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.737650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.737717 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.737740 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.737766 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.737787 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.840654 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.840830 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.840854 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.840879 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.840895 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.944707 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.944776 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.944788 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.944828 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:30 crc kubenswrapper[4825]: I1007 19:01:30.944841 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:30Z","lastTransitionTime":"2025-10-07T19:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.047534 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.047618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.047636 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.047664 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.047688 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.151429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.151512 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.151537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.151566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.151589 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.255506 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.255569 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.255585 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.255657 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.255678 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.359580 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.359689 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.360279 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.360371 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.360755 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.464396 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.464454 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.464473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.464497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.464514 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.567541 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.567603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.567620 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.567646 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.567666 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.670727 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.670786 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.670797 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.670815 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.670827 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.774137 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.774215 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.774278 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.774312 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.774336 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.794827 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:31 crc kubenswrapper[4825]: E1007 19:01:31.795058 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.795344 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.800566 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.800620 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:31 crc kubenswrapper[4825]: E1007 19:01:31.800852 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:31 crc kubenswrapper[4825]: E1007 19:01:31.801321 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:31 crc kubenswrapper[4825]: E1007 19:01:31.801661 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.802489 4825 scope.go:117] "RemoveContainer" containerID="80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.820422 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3b
ae6f5d4c71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"2025-10-07T19:00:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853\\\\n2025-10-07T19:00:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853 to /host/opt/cni/bin/\\\\n2025-10-07T19:00:34Z [verbose] multus-daemon started\\\\n2025-10-07T19:00:34Z [verbose] Readiness Indicator file check\\\\n2025-10-07T19:01:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\
\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.833941 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb1842a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.854416 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.877607 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38
f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.878983 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.879587 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.879616 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.879638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.879651 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.889249 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.903071 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc 
kubenswrapper[4825]: I1007 19:01:31.917092 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.938528 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.950659 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.968367 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f30cd95-eb57-436d-bb25-5d14cc087820\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.982763 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.983047 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.983130 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.983082 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.983202 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.983381 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:31Z","lastTransitionTime":"2025-10-07T19:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:31 crc kubenswrapper[4825]: I1007 19:01:31.997305 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:31Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.010964 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fdc6a60-c244-4f96-986c-e0cc5a38e110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7c70a9c7a88be1caa194e5ddab1f65f60518ca17d860a88ed660a9d033f758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://679ef9cbf4a8629dbdde9fdc019f83f9a2c7547f01e94274b597491322b6fd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://679ef9cbf4a8629dbdde9fdc019f83f9a2c7547f01e94274b597491322b6fd50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:32Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.027735 4825 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:32Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.044931 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:32Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.067698 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:32Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.086152 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.086196 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.086204 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.086222 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.086243 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.090937 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f
3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:32Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.108119 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:32Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.125938 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43e
fd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:32Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.188975 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.189034 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.189046 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.189084 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.189097 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.291868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.291917 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.291930 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.291947 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.291957 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.395539 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.395603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.395643 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.395681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.395707 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.498351 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.498412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.498426 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.498447 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.498460 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.600926 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.601004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.601028 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.601059 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.601077 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.704383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.704469 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.704491 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.704524 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.704546 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.807376 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.807424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.807437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.807457 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.807472 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.910599 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.910646 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.910659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.910676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:32 crc kubenswrapper[4825]: I1007 19:01:32.910690 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:32Z","lastTransitionTime":"2025-10-07T19:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.014050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.014091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.014104 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.014120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.014131 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.118273 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.118318 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.118328 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.118343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.118355 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.221537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.221595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.221611 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.221635 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.221652 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.294992 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/2.log" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.298582 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.299616 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.317572 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fdc6a60-c244-4f96-986c-e0cc5a38e110\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7c70a9c7a88be1caa194e5ddab1f65f60518ca17d860a88ed660a9d033f758\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://679ef9cbf4a8629dbdde9fdc019f83f9a2c7547f01e94274b597491322b6fd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://679ef9cbf4a8629dbdde9fdc019f83f9a2c7547f01e94274b597491322b6fd50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.323884 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.323941 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.323959 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.323982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.324000 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.352746 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d5790e3-7445-438e-b42a-c6211321f946\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da8f0fb9bf8168fd8af01d6cc3e5609f1a1ba62f0423270419b76bbd44afcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5dd98c87f274ee840586e43c8e7bba37d1410284492720e6d01378006cb4f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f8e3fc704ebabe20ca895748d87ee9e6b04639c7825d3697a8796b88f54e91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d323767fd6147f0d256dcc27e574c63dd03e69d589f9c40e1f95ec0d8f4427be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2e4e62a52d01aa10988bf37456126f2b5d366397fcfbcd8d0e45a5116a55b96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c4b21ef4c1776d482b38ece0a0bb86d02f723d3cbda612a3cc8b1a52f2f1f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a23acf257094723033f5e6543b839d71902af00f39a67ad42355f638e74665c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://395c759e65e5f8cfe785344b829fda0093ee15b4df0754f6fec656731355f970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-07T19:00:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.371604 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8260e302-4fb7-47ed-8381-9f4bfd827919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://483cc62ab6b64a27a0e910265666b4a1f15d9c92da0e246536f71858a9b6b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a08a2e2cd0625d48fe9d7b5b9518626de526af1df48de4c416a21bc8d599d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ed8418d1644b02d7e9b9f6208ed443d41c1b3f8c7262b57d6edf42265b2efdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e0993113763e665bd70ae2a3b5bda950d4f84e588ce2b94d6add879d125b4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.385534 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f30cd95-eb57-436d-bb25-5d14cc087820\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60f9716f9ac83aeb270019e1e2dfdc6d4aa8307f40949aeb39a95dd2134cc9cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://637e969a9a7909f0fc3e029f8bcf47c0c004ce9089ec75c8cc44adcdf333b1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57ebc28bef30bc9400af5461cb62e963762d349457aada53e6d1e9d8777b0d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6f83acfadc30a936b58da7008de9f678cdef4b6ab6650920b800b0bb14541490\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.401637 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.414840 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacb7f0705029f43e8142e3c65d7d559e6bc89dfab5b1b2df2bdce95bf7f5638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.425958 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.426028 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.426049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.426076 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.426098 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.435864 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996d6ee3-832b-4090-b15b-efea61174d29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799b7a30be83e06bbcc5a803be916ab4ff74df49d6f7610c009d299db07842eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593603a4beaf30496512043d60487ce06088b78dca8c7f8eeebec1c96c359ef9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12f4a091f6e3d3cffa9c5b19a820d9128d827fe2eda7cd6824e52b4fa4471be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbf8b7e0dda550e2b7198a6843acdb6d91526787a9460a48e82b4e2f630a68ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3bb297483c21966ae58e7d5323a2b98ffe1f056e2a346bb5a891e44510f6ef1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 19:00:25.359641 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 19:00:25.363485 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1838340145/tls.crt::/tmp/serving-cert-1838340145/tls.key\\\\\\\"\\\\nI1007 19:00:31.331562 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 19:00:31.338427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 19:00:31.338465 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 19:00:31.338509 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 19:00:31.338525 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 19:00:31.347132 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 19:00:31.347195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347206 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 19:00:31.347218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 19:00:31.347263 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 19:00:31.347272 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 19:00:31.347280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 19:00:31.347769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 19:00:31.368180 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8a04e59cc51611f139e3a255fecf7bea859629ce778f685344b95b6f49319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15e669871a8beb03e90659da29466b43efd7791bf81c6f4fe68461d928705f26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.452444 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.466767 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37063f57407fb67350d43af6cd0dcdfbf8578b3cecf64b6766bb06459cf1b218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd5d99c1243b8a3e4d524284f5818b2cc7410adb3865fc3e43660fd2faa60b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.481586 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b548a634a4fe066b4a971bc9fd5ce80f70036b5f00d9ae4386153fce12c9bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.505116 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11546b62-cdda-449d-963e-418c2d4b6e46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:01Z\\\",\\\"message\\\":\\\"ault network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:01Z is after 2025-08-24T17:21:41Z]\\\\nI1007 19:01:01.934399 6436 services_controller.go:451] Built service openshift-config-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-config-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-config-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.161\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:01:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmmv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6lvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.519559 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d90e25a-d8b6-4a4c-9948-c8ea3b38996c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2d0969fc24a0da54619501d29224a835772e85dee07940dee63ec5554f9891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82749148befd799cd1962c8be8688b1dd154
b1481de391a25a399f2f2e640bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxdpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5c4jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.528659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.528716 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.528729 4825 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.528748 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.528760 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.538447 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.553799 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zk9x9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44f62e96-26a6-4bfe-8e8c-6884216bd363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T19:01:19Z\\\",\\\"message\\\":\\\"2025-10-07T19:00:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853\\\\n2025-10-07T19:00:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_357741bc-6702-4d9e-94ce-a129a4f98853 to /host/opt/cni/bin/\\\\n2025-10-07T19:00:34Z [verbose] multus-daemon started\\\\n2025-10-07T19:00:34Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T19:01:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2gzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zk9x9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.569655 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtrsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0366d9-864d-4de0-8482-9d0a061fcd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3c122804372bb18
42a362067d274a1debd124b633605dbf43d21d52688ef96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgzvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtrsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.582295 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f038b04-14c9-421c-91e9-ab654b6c4ac8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33a8479acaa496914e4cdcec3509bc6eac7b68336baf6f63b9a7de6abb9fbac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4krj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.601806 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e48a4135-d1b9-4dfb-89fc-be393f7937aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://118b6f70278ab0bc5e10ad653b675b5790a88df552124be3fe509514c6d59a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9c4c8f4c40bd8ecd2d2e8ab9efb82aa0857bbaea2cbd427267542b841a0c31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62ab69ef71ce4c7fe4e469a3b409a3f365551d865ec05e55b5291729610937a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4289d1668f240f473ad6d8e51922d02e42cdd1941acca7ba26e7a4b9b125312\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ff38f1860510aff5fa9ff215b85d117bc26a83da8874f3423f3898e4cda471b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb
1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6e563f8f447182134f41b0f7e13adb1c7ec6298d168d5e30ab8356fdf9b560\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T19:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bc052428d4c8c39aba293baf5598001cbe604514414493e24533cdd219392a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T19:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-07T19:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwfhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6bwfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.619902 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ec9907a1140a7945131e8cbd1e14af9855b7b172b99de6b570dd249651633de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0
064031948dbbc070ab7ed954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T19:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wjlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b6jcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.632357 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.632439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.632469 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc 
kubenswrapper[4825]: I1007 19:01:33.632501 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.632527 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.637115 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee9b984f-baa3-429f-b929-3d61d5e204bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T19:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T19:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bvwh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T19:01:33Z is after 2025-08-24T17:21:41Z" Oct 07 19:01:33 crc 
kubenswrapper[4825]: I1007 19:01:33.736201 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.736308 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.736328 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.736354 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.736371 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.795414 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.795516 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.795556 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.795619 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:33 crc kubenswrapper[4825]: E1007 19:01:33.796084 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:33 crc kubenswrapper[4825]: E1007 19:01:33.796305 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:33 crc kubenswrapper[4825]: E1007 19:01:33.796341 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:33 crc kubenswrapper[4825]: E1007 19:01:33.796478 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.839757 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.839809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.839826 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.839848 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.839864 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.942887 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.942998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.943013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.943031 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:33 crc kubenswrapper[4825]: I1007 19:01:33.943054 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:33Z","lastTransitionTime":"2025-10-07T19:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.046417 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.046462 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.046474 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.046489 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.046505 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.149537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.149604 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.149623 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.149648 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.149665 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.252634 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.252707 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.252729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.252757 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.252776 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.305445 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/3.log" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.306598 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/2.log" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.311111 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" exitCode=1 Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.311170 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.311221 4825 scope.go:117] "RemoveContainer" containerID="80e2302aac4d1ce503e59410f7dc92462d52b99230aa283f8134b4e364f1a35f" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.312448 4825 scope.go:117] "RemoveContainer" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" Oct 07 19:01:34 crc kubenswrapper[4825]: E1007 19:01:34.312772 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.356266 4825 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.356304 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.356317 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.356335 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.356349 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.373916 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zk9x9" podStartSLOduration=63.373896403 podStartE2EDuration="1m3.373896403s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.362702155 +0000 UTC m=+83.184740802" watchObservedRunningTime="2025-10-07 19:01:34.373896403 +0000 UTC m=+83.195935040" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.385121 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vtrsb" podStartSLOduration=62.3850943 podStartE2EDuration="1m2.3850943s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.374038147 +0000 UTC m=+83.196076824" watchObservedRunningTime="2025-10-07 19:01:34.3850943 +0000 UTC m=+83.207132977" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.407466 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6bwfw" podStartSLOduration=63.407442504 podStartE2EDuration="1m3.407442504s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.407168855 +0000 UTC m=+83.229207512" watchObservedRunningTime="2025-10-07 19:01:34.407442504 +0000 UTC m=+83.229481141" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.407655 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xvdcs" podStartSLOduration=63.40765007 podStartE2EDuration="1m3.40765007s" 
podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.385418691 +0000 UTC m=+83.207457408" watchObservedRunningTime="2025-10-07 19:01:34.40765007 +0000 UTC m=+83.229688707" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.428506 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podStartSLOduration=63.428489786 podStartE2EDuration="1m3.428489786s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.428180216 +0000 UTC m=+83.250218893" watchObservedRunningTime="2025-10-07 19:01:34.428489786 +0000 UTC m=+83.250528423" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.452724 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.452703959 podStartE2EDuration="5.452703959s" podCreationTimestamp="2025-10-07 19:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.45244226 +0000 UTC m=+83.274480907" watchObservedRunningTime="2025-10-07 19:01:34.452703959 +0000 UTC m=+83.274742596" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.458846 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.458914 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.458935 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.458961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.458979 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.481288 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=59.481267221 podStartE2EDuration="59.481267221s" podCreationTimestamp="2025-10-07 19:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.480540768 +0000 UTC m=+83.302579405" watchObservedRunningTime="2025-10-07 19:01:34.481267221 +0000 UTC m=+83.303305898" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.499283 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=58.499256955999996 podStartE2EDuration="58.499256956s" podCreationTimestamp="2025-10-07 19:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.499115041 +0000 UTC m=+83.321153728" watchObservedRunningTime="2025-10-07 19:01:34.499256956 +0000 UTC m=+83.321295603" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.514540 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=32.514515983 podStartE2EDuration="32.514515983s" podCreationTimestamp="2025-10-07 19:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.513605333 +0000 UTC m=+83.335643980" watchObservedRunningTime="2025-10-07 19:01:34.514515983 +0000 UTC m=+83.336554630" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.561678 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.561730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.561744 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.561766 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.561782 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.565514 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=63.56548876 podStartE2EDuration="1m3.56548876s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.564948323 +0000 UTC m=+83.386986980" watchObservedRunningTime="2025-10-07 19:01:34.56548876 +0000 UTC m=+83.387527437" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.657278 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5c4jr" podStartSLOduration=62.65725245 podStartE2EDuration="1m2.65725245s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:34.655783503 +0000 UTC m=+83.477822140" watchObservedRunningTime="2025-10-07 19:01:34.65725245 +0000 UTC m=+83.479291097" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.663903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.663953 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.663969 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.663987 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.664002 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.767313 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.767393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.767416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.767450 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.767472 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.871067 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.871131 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.871150 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.871179 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.871196 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.974212 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.974291 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.974309 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.974341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:34 crc kubenswrapper[4825]: I1007 19:01:34.974358 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:34Z","lastTransitionTime":"2025-10-07T19:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.077852 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.077922 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.077941 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.077969 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.077986 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.181398 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.181459 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.181477 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.181504 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.181522 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.284695 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.284770 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.284787 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.284812 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.284830 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.316774 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/3.log" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.321569 4825 scope.go:117] "RemoveContainer" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.321889 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.387931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.387979 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.387989 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.388004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.388015 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.490924 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.490993 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.491012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.491038 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.491057 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.593991 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.594052 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.594070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.594099 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.594120 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.697425 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.697496 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.697513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.697537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.697554 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.795372 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.795422 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.795513 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.795519 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.795632 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.795805 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.795940 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.796033 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.800467 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.800523 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.800540 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.800563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.800581 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.826278 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.826488 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:39.826453041 +0000 UTC m=+148.648491678 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.826560 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.826752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.826751 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.826924 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.826943 4825 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:02:39.826926335 +0000 UTC m=+148.648965192 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.827044 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 19:02:39.827022078 +0000 UTC m=+148.649060725 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.903746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.903811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.903832 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.903860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.903878 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:35Z","lastTransitionTime":"2025-10-07T19:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.927721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:35 crc kubenswrapper[4825]: I1007 19:01:35.927820 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.927952 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.927996 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.928018 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.928022 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 
19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.928052 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.928070 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.928110 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 19:02:39.928083175 +0000 UTC m=+148.750121852 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:35 crc kubenswrapper[4825]: E1007 19:01:35.928149 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 19:02:39.928135357 +0000 UTC m=+148.750174034 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.006074 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.006135 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.006152 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.006176 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.006193 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.109075 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.109163 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.109187 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.109217 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.109273 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.213020 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.213085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.213102 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.213127 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.213146 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.315564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.315624 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.315646 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.315676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.315698 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.419064 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.419130 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.419148 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.419174 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.419193 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.522403 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.522484 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.522512 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.522538 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.522557 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.625182 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.625274 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.625295 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.625321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.625342 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.728713 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.728783 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.728808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.728833 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.728850 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.835830 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.836341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.836360 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.836422 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.836440 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.938609 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.938655 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.938674 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.938697 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:36 crc kubenswrapper[4825]: I1007 19:01:36.938716 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:36Z","lastTransitionTime":"2025-10-07T19:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.042110 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.042178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.042197 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.042255 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.042284 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.145607 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.145677 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.145882 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.145913 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.145935 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.248497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.248546 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.248557 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.248576 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.248588 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.351527 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.351581 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.351601 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.351630 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.351655 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.455335 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.455394 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.455413 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.455435 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.455479 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.558087 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.558155 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.558177 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.558203 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.558220 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.661758 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.661829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.661846 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.661871 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.661892 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.765716 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.765784 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.765801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.765826 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.765843 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.795534 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.795618 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:37 crc kubenswrapper[4825]: E1007 19:01:37.795804 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.795835 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:37 crc kubenswrapper[4825]: E1007 19:01:37.795980 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.795534 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:37 crc kubenswrapper[4825]: E1007 19:01:37.796126 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:37 crc kubenswrapper[4825]: E1007 19:01:37.796309 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.869010 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.869068 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.869081 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.869103 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.869117 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.972301 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.972433 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.972490 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.972522 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:37 crc kubenswrapper[4825]: I1007 19:01:37.972539 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:37Z","lastTransitionTime":"2025-10-07T19:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.075953 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.076018 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.076035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.076060 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.076078 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.179732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.179795 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.179813 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.179868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.179887 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.282950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.283016 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.283034 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.283059 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.283078 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.386213 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.386297 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.386311 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.386334 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.386350 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.489220 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.489306 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.489342 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.489375 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.489397 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.591861 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.591913 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.591934 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.591958 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.591976 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.695473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.695538 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.695558 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.695582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.695604 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.798682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.798747 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.798770 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.798802 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.798825 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.901693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.901754 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.901771 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.901798 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:38 crc kubenswrapper[4825]: I1007 19:01:38.901819 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:38Z","lastTransitionTime":"2025-10-07T19:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.004545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.004604 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.004621 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.004649 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.004668 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.107630 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.107707 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.107729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.107763 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.107788 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.210164 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.210273 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.210306 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.210337 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.210359 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.312580 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.312655 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.312680 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.312710 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.312734 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.415473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.415517 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.415527 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.415542 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.415553 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.519287 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.519366 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.519385 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.519412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.519436 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.622605 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.622716 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.622741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.622772 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.622796 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.726206 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.726385 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.726412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.726438 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.726456 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.794885 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.794968 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.794993 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:39 crc kubenswrapper[4825]: E1007 19:01:39.795060 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:39 crc kubenswrapper[4825]: E1007 19:01:39.795148 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.795199 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:39 crc kubenswrapper[4825]: E1007 19:01:39.795404 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:39 crc kubenswrapper[4825]: E1007 19:01:39.795497 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.807568 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.807626 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.807644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.807667 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.807686 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.832848 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.832911 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.832930 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.832955 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.832979 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T19:01:39Z","lastTransitionTime":"2025-10-07T19:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.871046 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp"] Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.871674 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.874438 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.874439 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.876090 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.876168 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.974930 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/974715c1-971d-48d0-8f4a-fff1a61aecee-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.974991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/974715c1-971d-48d0-8f4a-fff1a61aecee-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.975039 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/974715c1-971d-48d0-8f4a-fff1a61aecee-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.975062 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/974715c1-971d-48d0-8f4a-fff1a61aecee-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:39 crc kubenswrapper[4825]: I1007 19:01:39.975081 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/974715c1-971d-48d0-8f4a-fff1a61aecee-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.076512 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/974715c1-971d-48d0-8f4a-fff1a61aecee-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.076580 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/974715c1-971d-48d0-8f4a-fff1a61aecee-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 
07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.076632 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/974715c1-971d-48d0-8f4a-fff1a61aecee-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.076707 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/974715c1-971d-48d0-8f4a-fff1a61aecee-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.076647 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/974715c1-971d-48d0-8f4a-fff1a61aecee-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.076800 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/974715c1-971d-48d0-8f4a-fff1a61aecee-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.076906 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974715c1-971d-48d0-8f4a-fff1a61aecee-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.078375 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974715c1-971d-48d0-8f4a-fff1a61aecee-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.086546 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/974715c1-971d-48d0-8f4a-fff1a61aecee-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.107184 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/974715c1-971d-48d0-8f4a-fff1a61aecee-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hf6kp\" (UID: \"974715c1-971d-48d0-8f4a-fff1a61aecee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.185857 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" Oct 07 19:01:40 crc kubenswrapper[4825]: W1007 19:01:40.211368 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974715c1_971d_48d0_8f4a_fff1a61aecee.slice/crio-91846874a8cf2219834f357d5917c8398727e76faf2b1aa7ce94d8a61e4a34cb WatchSource:0}: Error finding container 91846874a8cf2219834f357d5917c8398727e76faf2b1aa7ce94d8a61e4a34cb: Status 404 returned error can't find the container with id 91846874a8cf2219834f357d5917c8398727e76faf2b1aa7ce94d8a61e4a34cb Oct 07 19:01:40 crc kubenswrapper[4825]: I1007 19:01:40.342054 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" event={"ID":"974715c1-971d-48d0-8f4a-fff1a61aecee","Type":"ContainerStarted","Data":"91846874a8cf2219834f357d5917c8398727e76faf2b1aa7ce94d8a61e4a34cb"} Oct 07 19:01:41 crc kubenswrapper[4825]: I1007 19:01:41.348790 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" event={"ID":"974715c1-971d-48d0-8f4a-fff1a61aecee","Type":"ContainerStarted","Data":"fb8e5db8b19cce99f99ae083c0dff910b0f54b926742a49799249b22cf343ccd"} Oct 07 19:01:41 crc kubenswrapper[4825]: I1007 19:01:41.795167 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:41 crc kubenswrapper[4825]: I1007 19:01:41.795191 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:41 crc kubenswrapper[4825]: I1007 19:01:41.795305 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:41 crc kubenswrapper[4825]: I1007 19:01:41.795317 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:41 crc kubenswrapper[4825]: E1007 19:01:41.797680 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:41 crc kubenswrapper[4825]: E1007 19:01:41.797971 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:41 crc kubenswrapper[4825]: E1007 19:01:41.798224 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:41 crc kubenswrapper[4825]: E1007 19:01:41.798452 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:43 crc kubenswrapper[4825]: I1007 19:01:43.795167 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:43 crc kubenswrapper[4825]: I1007 19:01:43.795187 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:43 crc kubenswrapper[4825]: E1007 19:01:43.795368 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:43 crc kubenswrapper[4825]: I1007 19:01:43.795251 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:43 crc kubenswrapper[4825]: E1007 19:01:43.795540 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:43 crc kubenswrapper[4825]: E1007 19:01:43.795741 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:43 crc kubenswrapper[4825]: I1007 19:01:43.796314 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:43 crc kubenswrapper[4825]: E1007 19:01:43.796447 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:45 crc kubenswrapper[4825]: I1007 19:01:45.795385 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:45 crc kubenswrapper[4825]: I1007 19:01:45.795478 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:45 crc kubenswrapper[4825]: I1007 19:01:45.795649 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:45 crc kubenswrapper[4825]: I1007 19:01:45.796000 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:45 crc kubenswrapper[4825]: E1007 19:01:45.795989 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:45 crc kubenswrapper[4825]: E1007 19:01:45.796075 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:45 crc kubenswrapper[4825]: E1007 19:01:45.796311 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:45 crc kubenswrapper[4825]: E1007 19:01:45.796473 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:46 crc kubenswrapper[4825]: I1007 19:01:46.796185 4825 scope.go:117] "RemoveContainer" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" Oct 07 19:01:46 crc kubenswrapper[4825]: E1007 19:01:46.796409 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:01:47 crc kubenswrapper[4825]: I1007 19:01:47.794862 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:47 crc kubenswrapper[4825]: E1007 19:01:47.795439 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:47 crc kubenswrapper[4825]: I1007 19:01:47.794934 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:47 crc kubenswrapper[4825]: E1007 19:01:47.795664 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:47 crc kubenswrapper[4825]: I1007 19:01:47.794873 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:47 crc kubenswrapper[4825]: E1007 19:01:47.795846 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:47 crc kubenswrapper[4825]: I1007 19:01:47.794961 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:47 crc kubenswrapper[4825]: E1007 19:01:47.796025 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:49 crc kubenswrapper[4825]: I1007 19:01:49.794792 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:49 crc kubenswrapper[4825]: I1007 19:01:49.794915 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:49 crc kubenswrapper[4825]: I1007 19:01:49.794842 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:49 crc kubenswrapper[4825]: I1007 19:01:49.794929 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:49 crc kubenswrapper[4825]: E1007 19:01:49.795075 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:49 crc kubenswrapper[4825]: E1007 19:01:49.795264 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:49 crc kubenswrapper[4825]: E1007 19:01:49.795418 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:49 crc kubenswrapper[4825]: E1007 19:01:49.795610 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:50 crc kubenswrapper[4825]: I1007 19:01:50.208225 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:50 crc kubenswrapper[4825]: E1007 19:01:50.208539 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:01:50 crc kubenswrapper[4825]: E1007 19:01:50.208615 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs podName:ee9b984f-baa3-429f-b929-3d61d5e204bc nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:54.208591613 +0000 UTC m=+163.030630280 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs") pod "network-metrics-daemon-bvwh2" (UID: "ee9b984f-baa3-429f-b929-3d61d5e204bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 19:01:51 crc kubenswrapper[4825]: I1007 19:01:51.795363 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:51 crc kubenswrapper[4825]: I1007 19:01:51.795363 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:51 crc kubenswrapper[4825]: I1007 19:01:51.795475 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:51 crc kubenswrapper[4825]: I1007 19:01:51.795493 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:51 crc kubenswrapper[4825]: E1007 19:01:51.797768 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:51 crc kubenswrapper[4825]: E1007 19:01:51.798264 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:51 crc kubenswrapper[4825]: E1007 19:01:51.798445 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:51 crc kubenswrapper[4825]: E1007 19:01:51.798561 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:53 crc kubenswrapper[4825]: I1007 19:01:53.795681 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:53 crc kubenswrapper[4825]: I1007 19:01:53.795733 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:53 crc kubenswrapper[4825]: I1007 19:01:53.795681 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:53 crc kubenswrapper[4825]: I1007 19:01:53.795838 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:53 crc kubenswrapper[4825]: E1007 19:01:53.796109 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:53 crc kubenswrapper[4825]: E1007 19:01:53.796389 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:53 crc kubenswrapper[4825]: E1007 19:01:53.796739 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:53 crc kubenswrapper[4825]: E1007 19:01:53.796896 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:55 crc kubenswrapper[4825]: I1007 19:01:55.794459 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:55 crc kubenswrapper[4825]: E1007 19:01:55.794623 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:55 crc kubenswrapper[4825]: I1007 19:01:55.794700 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:55 crc kubenswrapper[4825]: I1007 19:01:55.794750 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:55 crc kubenswrapper[4825]: E1007 19:01:55.794850 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:55 crc kubenswrapper[4825]: I1007 19:01:55.794879 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:55 crc kubenswrapper[4825]: E1007 19:01:55.794921 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:55 crc kubenswrapper[4825]: E1007 19:01:55.795117 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:57 crc kubenswrapper[4825]: I1007 19:01:57.794758 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:57 crc kubenswrapper[4825]: E1007 19:01:57.794956 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:01:57 crc kubenswrapper[4825]: I1007 19:01:57.795347 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:57 crc kubenswrapper[4825]: E1007 19:01:57.795457 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:57 crc kubenswrapper[4825]: I1007 19:01:57.795512 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:57 crc kubenswrapper[4825]: I1007 19:01:57.795525 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:57 crc kubenswrapper[4825]: E1007 19:01:57.796327 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:57 crc kubenswrapper[4825]: E1007 19:01:57.796425 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:57 crc kubenswrapper[4825]: I1007 19:01:57.796720 4825 scope.go:117] "RemoveContainer" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" Oct 07 19:01:57 crc kubenswrapper[4825]: E1007 19:01:57.796978 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:01:59 crc kubenswrapper[4825]: I1007 19:01:59.794936 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:01:59 crc kubenswrapper[4825]: I1007 19:01:59.795007 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:01:59 crc kubenswrapper[4825]: I1007 19:01:59.795024 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:01:59 crc kubenswrapper[4825]: I1007 19:01:59.795199 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:01:59 crc kubenswrapper[4825]: E1007 19:01:59.795181 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:01:59 crc kubenswrapper[4825]: E1007 19:01:59.795335 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:01:59 crc kubenswrapper[4825]: E1007 19:01:59.795561 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:01:59 crc kubenswrapper[4825]: E1007 19:01:59.795699 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:01 crc kubenswrapper[4825]: I1007 19:02:01.795169 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:01 crc kubenswrapper[4825]: I1007 19:02:01.795163 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:01 crc kubenswrapper[4825]: E1007 19:02:01.796694 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:01 crc kubenswrapper[4825]: I1007 19:02:01.796811 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:01 crc kubenswrapper[4825]: E1007 19:02:01.797067 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:01 crc kubenswrapper[4825]: E1007 19:02:01.797121 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:01 crc kubenswrapper[4825]: I1007 19:02:01.797183 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:01 crc kubenswrapper[4825]: E1007 19:02:01.797485 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:03 crc kubenswrapper[4825]: I1007 19:02:03.795195 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:03 crc kubenswrapper[4825]: I1007 19:02:03.795192 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:03 crc kubenswrapper[4825]: I1007 19:02:03.795393 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:03 crc kubenswrapper[4825]: E1007 19:02:03.796270 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:03 crc kubenswrapper[4825]: E1007 19:02:03.795946 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:03 crc kubenswrapper[4825]: I1007 19:02:03.795711 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:03 crc kubenswrapper[4825]: E1007 19:02:03.796463 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:03 crc kubenswrapper[4825]: E1007 19:02:03.796559 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:05 crc kubenswrapper[4825]: I1007 19:02:05.795170 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:05 crc kubenswrapper[4825]: E1007 19:02:05.795839 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:05 crc kubenswrapper[4825]: I1007 19:02:05.795192 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:05 crc kubenswrapper[4825]: I1007 19:02:05.795378 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:05 crc kubenswrapper[4825]: I1007 19:02:05.795304 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:05 crc kubenswrapper[4825]: E1007 19:02:05.796367 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:05 crc kubenswrapper[4825]: E1007 19:02:05.796505 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:05 crc kubenswrapper[4825]: E1007 19:02:05.796209 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:06 crc kubenswrapper[4825]: I1007 19:02:06.442222 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/1.log" Oct 07 19:02:06 crc kubenswrapper[4825]: I1007 19:02:06.442839 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/0.log" Oct 07 19:02:06 crc kubenswrapper[4825]: I1007 19:02:06.442916 4825 generic.go:334] "Generic (PLEG): container finished" podID="44f62e96-26a6-4bfe-8e8c-6884216bd363" containerID="58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953" exitCode=1 Oct 07 19:02:06 crc kubenswrapper[4825]: I1007 19:02:06.442977 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerDied","Data":"58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953"} Oct 07 19:02:06 crc kubenswrapper[4825]: I1007 19:02:06.443055 4825 scope.go:117] "RemoveContainer" containerID="ddf1d70084061ce6e41c6310e7b5eaa96dfd3fc3cb6d2f8af01e3bae6f5d4c71" Oct 07 19:02:06 crc kubenswrapper[4825]: I1007 19:02:06.443781 4825 scope.go:117] "RemoveContainer" containerID="58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953" Oct 07 19:02:06 crc kubenswrapper[4825]: E1007 19:02:06.444109 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zk9x9_openshift-multus(44f62e96-26a6-4bfe-8e8c-6884216bd363)\"" pod="openshift-multus/multus-zk9x9" podUID="44f62e96-26a6-4bfe-8e8c-6884216bd363" Oct 07 19:02:06 crc kubenswrapper[4825]: I1007 19:02:06.472751 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hf6kp" podStartSLOduration=95.472721291 podStartE2EDuration="1m35.472721291s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:01:41.369145649 +0000 UTC m=+90.191184316" watchObservedRunningTime="2025-10-07 19:02:06.472721291 +0000 UTC m=+115.294759968" Oct 07 19:02:07 crc kubenswrapper[4825]: I1007 19:02:07.449892 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/1.log" Oct 07 19:02:07 crc kubenswrapper[4825]: I1007 19:02:07.794972 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:07 crc kubenswrapper[4825]: I1007 19:02:07.795030 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:07 crc kubenswrapper[4825]: I1007 19:02:07.794972 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:07 crc kubenswrapper[4825]: E1007 19:02:07.795220 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:07 crc kubenswrapper[4825]: E1007 19:02:07.795379 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:07 crc kubenswrapper[4825]: E1007 19:02:07.795520 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:07 crc kubenswrapper[4825]: I1007 19:02:07.795709 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:07 crc kubenswrapper[4825]: E1007 19:02:07.795800 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:09 crc kubenswrapper[4825]: I1007 19:02:09.795507 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:09 crc kubenswrapper[4825]: I1007 19:02:09.795688 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:09 crc kubenswrapper[4825]: I1007 19:02:09.795944 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:09 crc kubenswrapper[4825]: E1007 19:02:09.795925 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:09 crc kubenswrapper[4825]: I1007 19:02:09.796046 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:09 crc kubenswrapper[4825]: E1007 19:02:09.796132 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:09 crc kubenswrapper[4825]: E1007 19:02:09.796348 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:09 crc kubenswrapper[4825]: E1007 19:02:09.796438 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:11 crc kubenswrapper[4825]: I1007 19:02:11.794855 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:11 crc kubenswrapper[4825]: I1007 19:02:11.794904 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:11 crc kubenswrapper[4825]: I1007 19:02:11.795412 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:11 crc kubenswrapper[4825]: I1007 19:02:11.795981 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:11 crc kubenswrapper[4825]: E1007 19:02:11.796676 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:11 crc kubenswrapper[4825]: E1007 19:02:11.797008 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:11 crc kubenswrapper[4825]: E1007 19:02:11.797136 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:11 crc kubenswrapper[4825]: E1007 19:02:11.797279 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:11 crc kubenswrapper[4825]: E1007 19:02:11.807680 4825 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 07 19:02:11 crc kubenswrapper[4825]: E1007 19:02:11.903735 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 07 19:02:12 crc kubenswrapper[4825]: I1007 19:02:12.796645 4825 scope.go:117] "RemoveContainer" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" Oct 07 19:02:12 crc kubenswrapper[4825]: E1007 19:02:12.797631 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6lvdm_openshift-ovn-kubernetes(11546b62-cdda-449d-963e-418c2d4b6e46)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" Oct 07 19:02:13 crc kubenswrapper[4825]: I1007 19:02:13.795096 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:13 crc kubenswrapper[4825]: I1007 19:02:13.795175 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:13 crc kubenswrapper[4825]: I1007 19:02:13.795191 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:13 crc kubenswrapper[4825]: I1007 19:02:13.795346 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:13 crc kubenswrapper[4825]: E1007 19:02:13.795338 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:13 crc kubenswrapper[4825]: E1007 19:02:13.795465 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:13 crc kubenswrapper[4825]: E1007 19:02:13.795597 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:13 crc kubenswrapper[4825]: E1007 19:02:13.795833 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:15 crc kubenswrapper[4825]: I1007 19:02:15.794778 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:15 crc kubenswrapper[4825]: I1007 19:02:15.794873 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:15 crc kubenswrapper[4825]: E1007 19:02:15.794969 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:15 crc kubenswrapper[4825]: I1007 19:02:15.795004 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:15 crc kubenswrapper[4825]: I1007 19:02:15.795078 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:15 crc kubenswrapper[4825]: E1007 19:02:15.795355 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:15 crc kubenswrapper[4825]: E1007 19:02:15.795496 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:15 crc kubenswrapper[4825]: E1007 19:02:15.795687 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:16 crc kubenswrapper[4825]: E1007 19:02:16.906265 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 19:02:17 crc kubenswrapper[4825]: I1007 19:02:17.795321 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:17 crc kubenswrapper[4825]: I1007 19:02:17.795432 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:17 crc kubenswrapper[4825]: I1007 19:02:17.795335 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:17 crc kubenswrapper[4825]: E1007 19:02:17.795548 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:17 crc kubenswrapper[4825]: I1007 19:02:17.795602 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:17 crc kubenswrapper[4825]: E1007 19:02:17.795704 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:17 crc kubenswrapper[4825]: E1007 19:02:17.795926 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:17 crc kubenswrapper[4825]: E1007 19:02:17.795990 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:19 crc kubenswrapper[4825]: I1007 19:02:19.795040 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:19 crc kubenswrapper[4825]: I1007 19:02:19.795108 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:19 crc kubenswrapper[4825]: E1007 19:02:19.795808 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:19 crc kubenswrapper[4825]: I1007 19:02:19.795194 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:19 crc kubenswrapper[4825]: E1007 19:02:19.795902 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:19 crc kubenswrapper[4825]: I1007 19:02:19.795175 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:19 crc kubenswrapper[4825]: E1007 19:02:19.796007 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:19 crc kubenswrapper[4825]: E1007 19:02:19.796119 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:20 crc kubenswrapper[4825]: I1007 19:02:20.795675 4825 scope.go:117] "RemoveContainer" containerID="58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953" Oct 07 19:02:21 crc kubenswrapper[4825]: I1007 19:02:21.505726 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/1.log" Oct 07 19:02:21 crc kubenswrapper[4825]: I1007 19:02:21.506102 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerStarted","Data":"7b220af5033e5f708bf3bc3586aa956717a4b5f61911848ffc4808a2221bcaa4"} Oct 07 19:02:21 crc kubenswrapper[4825]: I1007 19:02:21.794885 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:21 crc kubenswrapper[4825]: I1007 19:02:21.794924 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:21 crc kubenswrapper[4825]: I1007 19:02:21.794953 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:21 crc kubenswrapper[4825]: I1007 19:02:21.795014 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:21 crc kubenswrapper[4825]: E1007 19:02:21.796314 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:21 crc kubenswrapper[4825]: E1007 19:02:21.796970 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:21 crc kubenswrapper[4825]: E1007 19:02:21.797087 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:21 crc kubenswrapper[4825]: E1007 19:02:21.796835 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:21 crc kubenswrapper[4825]: E1007 19:02:21.907171 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 19:02:23 crc kubenswrapper[4825]: I1007 19:02:23.795265 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:23 crc kubenswrapper[4825]: I1007 19:02:23.795428 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:23 crc kubenswrapper[4825]: E1007 19:02:23.795501 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:23 crc kubenswrapper[4825]: I1007 19:02:23.795581 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:23 crc kubenswrapper[4825]: I1007 19:02:23.795645 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:23 crc kubenswrapper[4825]: E1007 19:02:23.795791 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:23 crc kubenswrapper[4825]: E1007 19:02:23.795925 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:23 crc kubenswrapper[4825]: E1007 19:02:23.796082 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:23 crc kubenswrapper[4825]: I1007 19:02:23.796933 4825 scope.go:117] "RemoveContainer" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" Oct 07 19:02:24 crc kubenswrapper[4825]: I1007 19:02:24.518363 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/3.log" Oct 07 19:02:24 crc kubenswrapper[4825]: I1007 19:02:24.521758 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerStarted","Data":"4e5cb1687aec0c71724fc5f61de5151d3cc9b7e2dad5ae77d4306a015abd7aeb"} Oct 07 19:02:24 crc kubenswrapper[4825]: I1007 19:02:24.522332 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:02:24 crc kubenswrapper[4825]: I1007 19:02:24.558315 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podStartSLOduration=113.558291517 podStartE2EDuration="1m53.558291517s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:24.556757187 +0000 UTC m=+133.378795834" watchObservedRunningTime="2025-10-07 19:02:24.558291517 +0000 UTC m=+133.380330184" Oct 07 19:02:24 crc kubenswrapper[4825]: I1007 19:02:24.704072 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bvwh2"] Oct 07 19:02:24 crc kubenswrapper[4825]: I1007 19:02:24.704224 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:24 crc kubenswrapper[4825]: E1007 19:02:24.704392 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:25 crc kubenswrapper[4825]: I1007 19:02:25.795599 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:25 crc kubenswrapper[4825]: I1007 19:02:25.795709 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:25 crc kubenswrapper[4825]: I1007 19:02:25.795599 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:25 crc kubenswrapper[4825]: E1007 19:02:25.795874 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:25 crc kubenswrapper[4825]: E1007 19:02:25.796079 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:25 crc kubenswrapper[4825]: E1007 19:02:25.796273 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:26 crc kubenswrapper[4825]: I1007 19:02:26.795299 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:26 crc kubenswrapper[4825]: E1007 19:02:26.795453 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:26 crc kubenswrapper[4825]: E1007 19:02:26.908666 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 19:02:27 crc kubenswrapper[4825]: I1007 19:02:27.794775 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:27 crc kubenswrapper[4825]: I1007 19:02:27.794849 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:27 crc kubenswrapper[4825]: I1007 19:02:27.794775 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:27 crc kubenswrapper[4825]: E1007 19:02:27.794999 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:27 crc kubenswrapper[4825]: E1007 19:02:27.795131 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:27 crc kubenswrapper[4825]: E1007 19:02:27.795340 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:28 crc kubenswrapper[4825]: I1007 19:02:28.794445 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:28 crc kubenswrapper[4825]: E1007 19:02:28.795539 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:29 crc kubenswrapper[4825]: I1007 19:02:29.794658 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:29 crc kubenswrapper[4825]: I1007 19:02:29.794755 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:29 crc kubenswrapper[4825]: E1007 19:02:29.794862 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:29 crc kubenswrapper[4825]: E1007 19:02:29.794956 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:29 crc kubenswrapper[4825]: I1007 19:02:29.795434 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:29 crc kubenswrapper[4825]: E1007 19:02:29.795591 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:30 crc kubenswrapper[4825]: I1007 19:02:30.794744 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:30 crc kubenswrapper[4825]: E1007 19:02:30.794953 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvwh2" podUID="ee9b984f-baa3-429f-b929-3d61d5e204bc" Oct 07 19:02:31 crc kubenswrapper[4825]: I1007 19:02:31.794614 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:31 crc kubenswrapper[4825]: I1007 19:02:31.794752 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:31 crc kubenswrapper[4825]: E1007 19:02:31.794794 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 19:02:31 crc kubenswrapper[4825]: I1007 19:02:31.794907 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:31 crc kubenswrapper[4825]: E1007 19:02:31.795096 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 19:02:31 crc kubenswrapper[4825]: E1007 19:02:31.798605 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 19:02:32 crc kubenswrapper[4825]: I1007 19:02:32.794914 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:32 crc kubenswrapper[4825]: I1007 19:02:32.799016 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 07 19:02:32 crc kubenswrapper[4825]: I1007 19:02:32.799805 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 07 19:02:33 crc kubenswrapper[4825]: I1007 19:02:33.798411 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:33 crc kubenswrapper[4825]: I1007 19:02:33.799913 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:33 crc kubenswrapper[4825]: I1007 19:02:33.800037 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:33 crc kubenswrapper[4825]: I1007 19:02:33.802364 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 07 19:02:33 crc kubenswrapper[4825]: I1007 19:02:33.802414 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 07 19:02:33 crc kubenswrapper[4825]: I1007 19:02:33.804292 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 07 19:02:33 crc kubenswrapper[4825]: I1007 19:02:33.805150 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 07 19:02:35 crc kubenswrapper[4825]: I1007 19:02:35.709542 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:02:35 crc kubenswrapper[4825]: I1007 19:02:35.710356 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:02:39 crc kubenswrapper[4825]: I1007 19:02:39.910649 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:39 crc kubenswrapper[4825]: E1007 19:02:39.910882 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:04:41.910842563 +0000 UTC m=+270.732881240 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:39 crc kubenswrapper[4825]: I1007 19:02:39.911390 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:39 crc kubenswrapper[4825]: I1007 19:02:39.911442 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:39 crc kubenswrapper[4825]: I1007 19:02:39.912601 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:39 crc kubenswrapper[4825]: I1007 19:02:39.919914 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.012528 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.012635 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.018376 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.018385 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:40 crc 
kubenswrapper[4825]: I1007 19:02:40.123919 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.142067 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.156777 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.535050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.583155 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5nrv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.584051 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.586967 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7hvsq"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.587955 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.592469 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.593813 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.594381 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.594959 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.595461 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.595923 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.597001 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.597552 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rp8vt"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.597901 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.603025 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.603070 4825 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.604499 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621166 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3cea192-f8e9-426c-887e-68a8d8f2dad5-config\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621347 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-client-ca\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621403 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3cea192-f8e9-426c-887e-68a8d8f2dad5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621445 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgdp9\" (UniqueName: \"kubernetes.io/projected/b3cea192-f8e9-426c-887e-68a8d8f2dad5-kube-api-access-xgdp9\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hzwl\" (UniqueName: \"kubernetes.io/projected/08f97853-1190-438d-91b7-f811400b541c-kube-api-access-2hzwl\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621545 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3cea192-f8e9-426c-887e-68a8d8f2dad5-images\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621566 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621578 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-config\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.621734 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.622586 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.623072 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.623150 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.624185 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.624404 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08f97853-1190-438d-91b7-f811400b541c-serving-cert\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.628533 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.628809 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.629220 4825 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.630168 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.633092 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.633349 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.633484 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.633616 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.635810 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.636537 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.636831 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.636841 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.637057 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.637380 4825 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.637869 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.637950 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.640055 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.640702 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.640726 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.641616 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-sqfnk"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.642335 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.655579 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.656607 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.656757 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.656891 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.656964 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.657273 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.657656 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.657934 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.658091 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.658769 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.659596 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.660524 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ll4q5"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.660915 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hpckv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.661191 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hpckv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.661206 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.666654 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.667696 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.668356 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.668769 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.668809 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5nrv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.668890 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.669328 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x74mv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.669544 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.669803 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.669942 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.672836 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7hvsq"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.669989 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.670040 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.670077 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.670104 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.670187 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 07 19:02:40 crc 
kubenswrapper[4825]: I1007 19:02:40.670422 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.670601 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.670644 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.670752 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.679183 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.679380 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.679654 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.680505 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.680664 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.679207 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.683223 4825 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.691681 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.696382 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hpckv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.696436 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrcjq"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.697578 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.697800 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.698531 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.698759 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.707310 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.707409 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.707477 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 
19:02:40.707482 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.707745 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.708001 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.708425 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.716213 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.728796 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.728938 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.729130 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.730442 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.730987 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.731342 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 07 19:02:40 
crc kubenswrapper[4825]: I1007 19:02:40.731520 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.734149 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.734932 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vntx8\" (UniqueName: \"kubernetes.io/projected/f9005f09-0f66-4541-8cb0-725ba2f4380d-kube-api-access-vntx8\") pod \"downloads-7954f5f757-hpckv\" (UID: \"f9005f09-0f66-4541-8cb0-725ba2f4380d\") " pod="openshift-console/downloads-7954f5f757-hpckv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.734962 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9758819-ad32-401c-a327-bb0dd9740946-auth-proxy-config\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.734991 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-client-ca\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735009 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-serving-cert\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: 
\"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735033 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3cea192-f8e9-426c-887e-68a8d8f2dad5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735050 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735071 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735087 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-service-ca\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735111 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xgdp9\" (UniqueName: \"kubernetes.io/projected/b3cea192-f8e9-426c-887e-68a8d8f2dad5-kube-api-access-xgdp9\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735127 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-oauth-serving-cert\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735145 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-config\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735200 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hzwl\" (UniqueName: \"kubernetes.io/projected/08f97853-1190-438d-91b7-f811400b541c-kube-api-access-2hzwl\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735222 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhcmk\" (UniqueName: \"kubernetes.io/projected/5c60c8b3-aa32-442a-a222-3ed689f4dd61-kube-api-access-nhcmk\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " 
pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735249 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-trusted-ca-bundle\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735267 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735283 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/12a12314-9e91-4b18-b2d6-f41489add427-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wnfts\" (UID: \"12a12314-9e91-4b18-b2d6-f41489add427\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735304 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08f97853-1190-438d-91b7-f811400b541c-serving-cert\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735319 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" 
(UniqueName: \"kubernetes.io/secret/d9758819-ad32-401c-a327-bb0dd9740946-machine-approver-tls\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735335 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735350 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735368 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-encryption-config\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735382 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-dir\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 
19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735447 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkx4z\" (UniqueName: \"kubernetes.io/projected/a07b3c13-9b79-45d7-a759-e9c119bbe37b-kube-api-access-dkx4z\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735480 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-etcd-client\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735498 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td5m8\" (UniqueName: \"kubernetes.io/projected/d9758819-ad32-401c-a327-bb0dd9740946-kube-api-access-td5m8\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735515 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735532 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735553 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c60c8b3-aa32-442a-a222-3ed689f4dd61-serving-cert\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735574 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3cea192-f8e9-426c-887e-68a8d8f2dad5-config\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735588 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-audit\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735603 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-serving-cert\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735619 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735639 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0581c024-1217-4c9d-b927-45a4327e8eec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735656 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-audit-policies\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735674 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-encryption-config\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735705 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735719 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvljb\" (UniqueName: \"kubernetes.io/projected/21bd5368-2631-4c6c-94cf-d6e64b1dd657-kube-api-access-fvljb\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735806 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-etcd-client\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735893 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735911 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9758819-ad32-401c-a327-bb0dd9740946-config\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735933 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d77404b-ecd2-497c-9f7c-ec1ff470755e-node-pullsecrets\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735949 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-policies\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735966 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d77404b-ecd2-497c-9f7c-ec1ff470755e-audit-dir\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736032 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2t28\" (UniqueName: \"kubernetes.io/projected/1d77404b-ecd2-497c-9f7c-ec1ff470755e-kube-api-access-l2t28\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736086 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-oauth-config\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736146 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-config\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736176 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-etcd-serving-ca\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736192 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-serving-cert\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c60c8b3-aa32-442a-a222-3ed689f4dd61-trusted-ca\") pod 
\"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736222 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzrp8\" (UniqueName: \"kubernetes.io/projected/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-kube-api-access-tzrp8\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736271 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3cea192-f8e9-426c-887e-68a8d8f2dad5-images\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736285 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a07b3c13-9b79-45d7-a759-e9c119bbe37b-audit-dir\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736318 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-config\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736336 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736354 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0581c024-1217-4c9d-b927-45a4327e8eec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736442 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjczj\" (UniqueName: \"kubernetes.io/projected/0581c024-1217-4c9d-b927-45a4327e8eec-kube-api-access-cjczj\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c60c8b3-aa32-442a-a222-3ed689f4dd61-config\") pod \"console-operator-58897d9998-ll4q5\" (UID: 
\"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736543 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-image-import-ca\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwtps\" (UniqueName: \"kubernetes.io/projected/12a12314-9e91-4b18-b2d6-f41489add427-kube-api-access-kwtps\") pod \"cluster-samples-operator-665b6dd947-wnfts\" (UID: \"12a12314-9e91-4b18-b2d6-f41489add427\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736579 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-client-ca\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736596 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-config\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736613 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b4f007-8217-4308-996e-394b0c3d072c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736632 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0581c024-1217-4c9d-b927-45a4327e8eec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736647 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.736664 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d4n7\" (UniqueName: 
\"kubernetes.io/projected/a3b4f007-8217-4308-996e-394b0c3d072c-kube-api-access-2d4n7\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.737450 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-client-ca\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.738481 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.738918 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.739144 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lfhcp"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.739473 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.739512 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.739616 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.735827 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.738498 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.739415 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.739446 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.744062 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.745235 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.745350 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3cea192-f8e9-426c-887e-68a8d8f2dad5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.745566 4825 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.746357 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.747421 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.749362 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.749463 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.749613 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r2xb4"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.750179 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.750364 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-config\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.751394 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pmswz"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.752005 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.752090 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.752137 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08f97853-1190-438d-91b7-f811400b541c-serving-cert\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.753685 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3cea192-f8e9-426c-887e-68a8d8f2dad5-config\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.755410 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/b3cea192-f8e9-426c-887e-68a8d8f2dad5-images\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.755431 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.757134 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.757699 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.758771 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.761452 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.761534 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.761758 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.764360 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.764913 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.766406 
4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.766827 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.766866 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.767652 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.768139 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.769297 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.769623 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.769933 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.772495 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.772776 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.776011 
4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.776408 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.777484 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.777633 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.778445 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.778517 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.779409 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgdp9\" (UniqueName: \"kubernetes.io/projected/b3cea192-f8e9-426c-887e-68a8d8f2dad5-kube-api-access-xgdp9\") pod \"machine-api-operator-5694c8668f-7hvsq\" (UID: \"b3cea192-f8e9-426c-887e-68a8d8f2dad5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.785662 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.786312 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.786928 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.792885 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jfp2b"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.795872 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sqfnk"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.796479 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.805513 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hzwl\" (UniqueName: \"kubernetes.io/projected/08f97853-1190-438d-91b7-f811400b541c-kube-api-access-2hzwl\") pod \"controller-manager-879f6c89f-x5nrv\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.808191 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7dfj7"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.808424 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.808910 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.809775 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.811195 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.812591 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.813246 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.813644 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rp8vt"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.814677 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9rmtb"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.815948 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.823137 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.824819 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.825071 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.825315 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.825442 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.826091 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.828402 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.828995 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8xv6b"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.829128 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.829320 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.829366 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.829965 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.830389 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-254wk"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.830972 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.831189 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.831679 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.831980 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.832595 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.832922 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ll4q5"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.833856 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x74mv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.834722 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qbvdj"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.835159 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.836613 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837204 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc20bdf9-7763-495f-b56e-1bf9ff56686e-proxy-tls\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837312 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0581c024-1217-4c9d-b927-45a4327e8eec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837387 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjczj\" (UniqueName: \"kubernetes.io/projected/0581c024-1217-4c9d-b927-45a4327e8eec-kube-api-access-cjczj\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837466 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c60c8b3-aa32-442a-a222-3ed689f4dd61-config\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " 
pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837541 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837642 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e3cbe84-8ae0-4019-9656-3db9415aee73-serving-cert\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837714 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-image-import-ca\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837785 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e92d41cb-ecb1-461c-bdec-314b33ae9d36-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837867 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837976 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwtps\" (UniqueName: \"kubernetes.io/projected/12a12314-9e91-4b18-b2d6-f41489add427-kube-api-access-kwtps\") pod \"cluster-samples-operator-665b6dd947-wnfts\" (UID: \"12a12314-9e91-4b18-b2d6-f41489add427\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.837675 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7rf45"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838217 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-client-ca\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838321 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-ca\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838405 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838522 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-config\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838590 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b4f007-8217-4308-996e-394b0c3d072c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838653 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0581c024-1217-4c9d-b927-45a4327e8eec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838718 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d4n7\" (UniqueName: \"kubernetes.io/projected/a3b4f007-8217-4308-996e-394b0c3d072c-kube-api-access-2d4n7\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc 
kubenswrapper[4825]: I1007 19:02:40.838789 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-config\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839643 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kh7p\" (UniqueName: \"kubernetes.io/projected/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-kube-api-access-6kh7p\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839769 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-serving-cert\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839838 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vntx8\" (UniqueName: \"kubernetes.io/projected/f9005f09-0f66-4541-8cb0-725ba2f4380d-kube-api-access-vntx8\") pod \"downloads-7954f5f757-hpckv\" (UID: \"f9005f09-0f66-4541-8cb0-725ba2f4380d\") " pod="openshift-console/downloads-7954f5f757-hpckv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839917 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9758819-ad32-401c-a327-bb0dd9740946-auth-proxy-config\") pod \"machine-approver-56656f9798-lnxqj\" 
(UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839990 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-service-ca\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840056 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840128 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac79181c-997a-4974-9fdd-aea6b3f2903c-metrics-tls\") pod \"dns-operator-744455d44c-pmswz\" (UID: \"ac79181c-997a-4974-9fdd-aea6b3f2903c\") " pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839622 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840222 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0581c024-1217-4c9d-b927-45a4327e8eec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840319 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840392 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-service-ca\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840460 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-oauth-serving-cert\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840525 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-config\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840629 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jbxr\" (UniqueName: 
\"kubernetes.io/projected/4e3cbe84-8ae0-4019-9656-3db9415aee73-kube-api-access-9jbxr\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840743 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb91d816-c309-4f6c-96b3-79ae595907f7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840799 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840835 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840902 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.840975 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92d41cb-ecb1-461c-bdec-314b33ae9d36-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.841049 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltpk4\" (UniqueName: \"kubernetes.io/projected/dc20bdf9-7763-495f-b56e-1bf9ff56686e-kube-api-access-ltpk4\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-config\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.841305 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcmk\" (UniqueName: \"kubernetes.io/projected/5c60c8b3-aa32-442a-a222-3ed689f4dd61-kube-api-access-nhcmk\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.841939 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-trusted-ca-bundle\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842030 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838815 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-image-import-ca\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838976 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-client-ca\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842073 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9758819-ad32-401c-a327-bb0dd9740946-auth-proxy-config\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842140 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-service-ca\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842198 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b4f007-8217-4308-996e-394b0c3d072c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.838434 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c60c8b3-aa32-442a-a222-3ed689f4dd61-config\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.839685 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842471 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-oauth-serving-cert\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842573 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7v9z\" (UniqueName: \"kubernetes.io/projected/8c611070-b4f9-4b32-b436-d8c94d8b09df-kube-api-access-k7v9z\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/12a12314-9e91-4b18-b2d6-f41489add427-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wnfts\" (UID: \"12a12314-9e91-4b18-b2d6-f41489add427\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842681 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843017 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb91d816-c309-4f6c-96b3-79ae595907f7-config\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843264 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-trusted-ca-bundle\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843316 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-client\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.842716 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-config\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843370 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c611070-b4f9-4b32-b436-d8c94d8b09df-srv-cert\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843405 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9758819-ad32-401c-a327-bb0dd9740946-machine-approver-tls\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843454 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843507 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843608 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bllp8\" (UniqueName: \"kubernetes.io/projected/18b317fe-8d88-4915-bc98-89f42c8d4484-kube-api-access-bllp8\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843635 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-encryption-config\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843653 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-dir\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843673 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkx4z\" (UniqueName: \"kubernetes.io/projected/a07b3c13-9b79-45d7-a759-e9c119bbe37b-kube-api-access-dkx4z\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843692 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-etcd-client\") pod 
\"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843710 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td5m8\" (UniqueName: \"kubernetes.io/projected/d9758819-ad32-401c-a327-bb0dd9740946-kube-api-access-td5m8\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843719 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-dir\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843733 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843774 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5c60c8b3-aa32-442a-a222-3ed689f4dd61-serving-cert\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843792 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-serving-cert\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843795 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-audit\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-serving-cert\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843859 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843878 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843897 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvbb\" (UniqueName: \"kubernetes.io/projected/e92d41cb-ecb1-461c-bdec-314b33ae9d36-kube-api-access-6jvbb\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843917 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0581c024-1217-4c9d-b927-45a4327e8eec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843934 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-audit-policies\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843978 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5ps8\" (UniqueName: \"kubernetes.io/projected/ac79181c-997a-4974-9fdd-aea6b3f2903c-kube-api-access-w5ps8\") pod \"dns-operator-744455d44c-pmswz\" (UID: 
\"ac79181c-997a-4974-9fdd-aea6b3f2903c\") " pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.843996 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-encryption-config\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844014 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9758819-ad32-401c-a327-bb0dd9740946-config\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844031 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844048 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvljb\" (UniqueName: \"kubernetes.io/projected/21bd5368-2631-4c6c-94cf-d6e64b1dd657-kube-api-access-fvljb\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844075 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-etcd-client\") 
pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844089 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844105 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb91d816-c309-4f6c-96b3-79ae595907f7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844119 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c611070-b4f9-4b32-b436-d8c94d8b09df-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844139 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b317fe-8d88-4915-bc98-89f42c8d4484-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844156 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d77404b-ecd2-497c-9f7c-ec1ff470755e-node-pullsecrets\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844172 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-policies\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844189 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc20bdf9-7763-495f-b56e-1bf9ff56686e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844204 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-oauth-config\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844206 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-audit\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: 
I1007 19:02:40.844221 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d77404b-ecd2-497c-9f7c-ec1ff470755e-audit-dir\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844254 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2t28\" (UniqueName: \"kubernetes.io/projected/1d77404b-ecd2-497c-9f7c-ec1ff470755e-kube-api-access-l2t28\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844271 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-serving-cert\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844287 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18b317fe-8d88-4915-bc98-89f42c8d4484-metrics-tls\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844302 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844320 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-config\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844338 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-etcd-serving-ca\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844357 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a07b3c13-9b79-45d7-a759-e9c119bbe37b-audit-dir\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844373 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c60c8b3-aa32-442a-a222-3ed689f4dd61-trusted-ca\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844389 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzrp8\" (UniqueName: \"kubernetes.io/projected/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-kube-api-access-tzrp8\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: 
\"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844402 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844407 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b317fe-8d88-4915-bc98-89f42c8d4484-trusted-ca\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.844444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.845204 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.845423 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.845462 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pmswz"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.845622 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9758819-ad32-401c-a327-bb0dd9740946-machine-approver-tls\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.845725 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.846186 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.846739 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9758819-ad32-401c-a327-bb0dd9740946-config\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.846908 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847080 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-serving-cert\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847129 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a07b3c13-9b79-45d7-a759-e9c119bbe37b-audit-dir\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847331 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/12a12314-9e91-4b18-b2d6-f41489add427-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wnfts\" (UID: \"12a12314-9e91-4b18-b2d6-f41489add427\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847515 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d77404b-ecd2-497c-9f7c-ec1ff470755e-node-pullsecrets\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847534 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-etcd-serving-ca\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847680 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847804 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847879 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d77404b-ecd2-497c-9f7c-ec1ff470755e-audit-dir\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.847992 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c60c8b3-aa32-442a-a222-3ed689f4dd61-trusted-ca\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.848014 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-policies\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.848063 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-config\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.848328 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.848405 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.848441 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a07b3c13-9b79-45d7-a759-e9c119bbe37b-audit-policies\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.848499 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r2xb4"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.848765 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-encryption-config\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.849241 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-etcd-client\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.849335 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d77404b-ecd2-497c-9f7c-ec1ff470755e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.849474 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-serving-cert\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.849796 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/a07b3c13-9b79-45d7-a759-e9c119bbe37b-etcd-client\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.849824 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.850459 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-oauth-config\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.850524 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.850716 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrcjq"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.851543 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d77404b-ecd2-497c-9f7c-ec1ff470755e-encryption-config\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.851624 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-7dfj7"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.851689 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0581c024-1217-4c9d-b927-45a4327e8eec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.852454 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c60c8b3-aa32-442a-a222-3ed689f4dd61-serving-cert\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.852492 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.853479 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.854479 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9rmtb"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.855695 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.856837 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.858004 4825 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mnjlm"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.860288 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mxsfh"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.860443 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.861895 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.862325 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.863693 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lfhcp"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.865134 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7rf45"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.870723 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.870802 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.873672 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.877278 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs"] Oct 07 
19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.878954 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.880103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mxsfh"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.881341 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.882425 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mnjlm"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.883454 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-254wk"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.884466 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8xv6b"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.886043 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.889280 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.890383 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.891577 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lmk59"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.892593 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-lmk59"] Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.892721 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.906449 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.926960 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.931655 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945201 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18b317fe-8d88-4915-bc98-89f42c8d4484-metrics-tls\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945246 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945276 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b317fe-8d88-4915-bc98-89f42c8d4484-trusted-ca\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945314 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e3cbe84-8ae0-4019-9656-3db9415aee73-serving-cert\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945331 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc20bdf9-7763-495f-b56e-1bf9ff56686e-proxy-tls\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945364 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e92d41cb-ecb1-461c-bdec-314b33ae9d36-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945384 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-ca\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945402 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945440 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-config\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945464 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kh7p\" (UniqueName: \"kubernetes.io/projected/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-kube-api-access-6kh7p\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945494 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-service-ca\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945512 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac79181c-997a-4974-9fdd-aea6b3f2903c-metrics-tls\") pod \"dns-operator-744455d44c-pmswz\" (UID: \"ac79181c-997a-4974-9fdd-aea6b3f2903c\") " pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945531 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9jbxr\" (UniqueName: \"kubernetes.io/projected/4e3cbe84-8ae0-4019-9656-3db9415aee73-kube-api-access-9jbxr\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945548 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb91d816-c309-4f6c-96b3-79ae595907f7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945572 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92d41cb-ecb1-461c-bdec-314b33ae9d36-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945594 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltpk4\" (UniqueName: \"kubernetes.io/projected/dc20bdf9-7763-495f-b56e-1bf9ff56686e-kube-api-access-ltpk4\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945623 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7v9z\" (UniqueName: \"kubernetes.io/projected/8c611070-b4f9-4b32-b436-d8c94d8b09df-kube-api-access-k7v9z\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: 
\"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945641 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb91d816-c309-4f6c-96b3-79ae595907f7-config\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945658 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-client\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945676 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c611070-b4f9-4b32-b436-d8c94d8b09df-srv-cert\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945699 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bllp8\" (UniqueName: \"kubernetes.io/projected/18b317fe-8d88-4915-bc98-89f42c8d4484-kube-api-access-bllp8\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945737 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvbb\" (UniqueName: 
\"kubernetes.io/projected/e92d41cb-ecb1-461c-bdec-314b33ae9d36-kube-api-access-6jvbb\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945756 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5ps8\" (UniqueName: \"kubernetes.io/projected/ac79181c-997a-4974-9fdd-aea6b3f2903c-kube-api-access-w5ps8\") pod \"dns-operator-744455d44c-pmswz\" (UID: \"ac79181c-997a-4974-9fdd-aea6b3f2903c\") " pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945790 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb91d816-c309-4f6c-96b3-79ae595907f7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945807 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c611070-b4f9-4b32-b436-d8c94d8b09df-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945825 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc20bdf9-7763-495f-b56e-1bf9ff56686e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.945841 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b317fe-8d88-4915-bc98-89f42c8d4484-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.946380 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e92d41cb-ecb1-461c-bdec-314b33ae9d36-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.946891 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-ca\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.947220 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.948427 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.948452 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-config\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.948811 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-service-ca\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.949030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92d41cb-ecb1-461c-bdec-314b33ae9d36-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.949399 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc20bdf9-7763-495f-b56e-1bf9ff56686e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.950019 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.950404 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e3cbe84-8ae0-4019-9656-3db9415aee73-serving-cert\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.953195 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4e3cbe84-8ae0-4019-9656-3db9415aee73-etcd-client\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.957469 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.967077 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 07 19:02:40 crc kubenswrapper[4825]: I1007 19:02:40.986256 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.000828 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac79181c-997a-4974-9fdd-aea6b3f2903c-metrics-tls\") pod \"dns-operator-744455d44c-pmswz\" (UID: \"ac79181c-997a-4974-9fdd-aea6b3f2903c\") " pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.011971 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.026468 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.049688 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.068277 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.081548 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18b317fe-8d88-4915-bc98-89f42c8d4484-metrics-tls\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.088266 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.109887 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5nrv"] Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.113268 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.116554 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b317fe-8d88-4915-bc98-89f42c8d4484-trusted-ca\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 
19:02:41.131888 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.144773 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7hvsq"] Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.147601 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 07 19:02:41 crc kubenswrapper[4825]: W1007 19:02:41.149710 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3cea192_f8e9_426c_887e_68a8d8f2dad5.slice/crio-aa323cf455d820763061f293af2387bea6251a6cb72c803ae297627e9b153b72 WatchSource:0}: Error finding container aa323cf455d820763061f293af2387bea6251a6cb72c803ae297627e9b153b72: Status 404 returned error can't find the container with id aa323cf455d820763061f293af2387bea6251a6cb72c803ae297627e9b153b72 Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.160105 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc20bdf9-7763-495f-b56e-1bf9ff56686e-proxy-tls\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.166916 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.180168 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb91d816-c309-4f6c-96b3-79ae595907f7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: 
\"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.187256 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.206173 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.250712 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.252627 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.258951 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb91d816-c309-4f6c-96b3-79ae595907f7-config\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.262421 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c611070-b4f9-4b32-b436-d8c94d8b09df-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.267295 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 
19:02:41.272529 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c611070-b4f9-4b32-b436-d8c94d8b09df-srv-cert\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.287515 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.306878 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.326616 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.347372 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.366138 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.388468 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.406594 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.427332 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 
19:02:41.446679 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.467002 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.487221 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.507035 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.527006 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.546662 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.586415 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.598999 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"845fa2aeede8b38c5cd0063588203533ade3247368574ee2be7a48daed218027"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.599056 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ba85909f010a1870a12cc105d25e2979609c1334267d259088e154812de27e0c"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.599325 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.601612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" event={"ID":"b3cea192-f8e9-426c-887e-68a8d8f2dad5","Type":"ContainerStarted","Data":"7d3f3ddabe640dac0848d3badfa572bd800dc798788164b3ce794dcba619eeb5"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.601721 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" event={"ID":"b3cea192-f8e9-426c-887e-68a8d8f2dad5","Type":"ContainerStarted","Data":"e5a1077ae4df84d0c8d8fcbefcf52112dfdf9674a313bf7ae8e0ee021f704b37"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.601803 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" event={"ID":"b3cea192-f8e9-426c-887e-68a8d8f2dad5","Type":"ContainerStarted","Data":"aa323cf455d820763061f293af2387bea6251a6cb72c803ae297627e9b153b72"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.603478 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5015fc7ed5b69481a246daf92fa3a3f732f5bfd01f1dac53c721b47a7b41591e"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.603584 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a8d501cceccb12ea3195f5ff5c4040ddbcdb6c005e200611b95cac4e454bf6fb"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.605556 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" 
event={"ID":"08f97853-1190-438d-91b7-f811400b541c","Type":"ContainerStarted","Data":"cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.605612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" event={"ID":"08f97853-1190-438d-91b7-f811400b541c","Type":"ContainerStarted","Data":"e0cc0fb1db28d9d8b837c660f9ffa3b82ba0f5975c5bce267bb11a0eb850dbca"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.605819 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.606669 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.607853 4825 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x5nrv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.607969 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" podUID="08f97853-1190-438d-91b7-f811400b541c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.607850 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2e4a3480881eb3142a2dda8879de0f764813993fa14ae10bbba3eb3b0c650995"} Oct 07 19:02:41 crc 
kubenswrapper[4825]: I1007 19:02:41.608215 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ffa5d41ad683b30199e8df9abf79a1d7043e749de1662e9b4095c6ec437b7df9"} Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.626430 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.646566 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.666614 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.699342 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.708060 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.727112 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.748017 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.767081 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.787658 4825 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.806370 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.828121 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.844610 4825 request.go:700] Waited for 1.019011275s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcollect-profiles-dockercfg-kzf4t&limit=500&resourceVersion=0 Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.846054 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.867406 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.887298 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.906973 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.928111 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.948321 4825 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.966687 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 07 19:02:41 crc kubenswrapper[4825]: I1007 19:02:41.986876 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.007346 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.027112 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.047629 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.066282 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.088466 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.106979 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.126536 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.147104 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.167876 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.186563 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.208984 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.226398 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.246627 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.266972 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.287026 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.308552 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.327104 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.347644 4825 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.366957 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.387008 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.428504 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0581c024-1217-4c9d-b927-45a4327e8eec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.454578 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjczj\" (UniqueName: \"kubernetes.io/projected/0581c024-1217-4c9d-b927-45a4327e8eec-kube-api-access-cjczj\") pod \"cluster-image-registry-operator-dc59b4c8b-mdzv5\" (UID: \"0581c024-1217-4c9d-b927-45a4327e8eec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.482407 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwtps\" (UniqueName: \"kubernetes.io/projected/12a12314-9e91-4b18-b2d6-f41489add427-kube-api-access-kwtps\") pod \"cluster-samples-operator-665b6dd947-wnfts\" (UID: \"12a12314-9e91-4b18-b2d6-f41489add427\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.488314 4825 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.492520 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.503053 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d4n7\" (UniqueName: \"kubernetes.io/projected/a3b4f007-8217-4308-996e-394b0c3d072c-kube-api-access-2d4n7\") pod \"route-controller-manager-6576b87f9c-qr8n7\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.504459 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.524032 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vntx8\" (UniqueName: \"kubernetes.io/projected/f9005f09-0f66-4541-8cb0-725ba2f4380d-kube-api-access-vntx8\") pod \"downloads-7954f5f757-hpckv\" (UID: \"f9005f09-0f66-4541-8cb0-725ba2f4380d\") " pod="openshift-console/downloads-7954f5f757-hpckv" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.546505 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcmk\" (UniqueName: \"kubernetes.io/projected/5c60c8b3-aa32-442a-a222-3ed689f4dd61-kube-api-access-nhcmk\") pod \"console-operator-58897d9998-ll4q5\" (UID: \"5c60c8b3-aa32-442a-a222-3ed689f4dd61\") " pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.546998 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 07 
19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.567219 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.586939 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.596610 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hpckv" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.607837 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.624629 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.653633 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkx4z\" (UniqueName: \"kubernetes.io/projected/a07b3c13-9b79-45d7-a759-e9c119bbe37b-kube-api-access-dkx4z\") pod \"apiserver-7bbb656c7d-dh96g\" (UID: \"a07b3c13-9b79-45d7-a759-e9c119bbe37b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.663163 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td5m8\" (UniqueName: \"kubernetes.io/projected/d9758819-ad32-401c-a327-bb0dd9740946-kube-api-access-td5m8\") pod \"machine-approver-56656f9798-lnxqj\" (UID: \"d9758819-ad32-401c-a327-bb0dd9740946\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.692615 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzrp8\" (UniqueName: 
\"kubernetes.io/projected/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-kube-api-access-tzrp8\") pod \"oauth-openshift-558db77b4-x74mv\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.704868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2t28\" (UniqueName: \"kubernetes.io/projected/1d77404b-ecd2-497c-9f7c-ec1ff470755e-kube-api-access-l2t28\") pod \"apiserver-76f77b778f-rp8vt\" (UID: \"1d77404b-ecd2-497c-9f7c-ec1ff470755e\") " pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.717561 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.727457 4825 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.742816 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvljb\" (UniqueName: \"kubernetes.io/projected/21bd5368-2631-4c6c-94cf-d6e64b1dd657-kube-api-access-fvljb\") pod \"console-f9d7485db-sqfnk\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.747138 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.766736 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.767868 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.787873 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.796785 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.797112 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.797498 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.805738 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.826699 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.832451 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5"] Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.847835 4825 request.go:700] Waited for 1.985639958s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.850390 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 07 19:02:42 crc 
kubenswrapper[4825]: I1007 19:02:42.861496 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.861851 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.875661 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts"] Oct 07 19:02:42 crc kubenswrapper[4825]: W1007 19:02:42.880014 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0581c024_1217_4c9d_b927_45a4327e8eec.slice/crio-ff32dc63aa96ceadb177e5744482cc20496538c325030a58cff358ea4fc46773 WatchSource:0}: Error finding container ff32dc63aa96ceadb177e5744482cc20496538c325030a58cff358ea4fc46773: Status 404 returned error can't find the container with id ff32dc63aa96ceadb177e5744482cc20496538c325030a58cff358ea4fc46773 Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.889994 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.909541 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.930570 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.984795 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b317fe-8d88-4915-bc98-89f42c8d4484-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.988661 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jbxr\" (UniqueName: \"kubernetes.io/projected/4e3cbe84-8ae0-4019-9656-3db9415aee73-kube-api-access-9jbxr\") pod \"etcd-operator-b45778765-lfhcp\" (UID: \"4e3cbe84-8ae0-4019-9656-3db9415aee73\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:42 crc kubenswrapper[4825]: I1007 19:02:42.999743 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hpckv"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.030323 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb91d816-c309-4f6c-96b3-79ae595907f7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bk8lv\" (UID: \"bb91d816-c309-4f6c-96b3-79ae595907f7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.031368 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvbb\" (UniqueName: \"kubernetes.io/projected/e92d41cb-ecb1-461c-bdec-314b33ae9d36-kube-api-access-6jvbb\") pod \"openshift-apiserver-operator-796bbdcf4f-jzbkj\" (UID: \"e92d41cb-ecb1-461c-bdec-314b33ae9d36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.043466 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5ps8\" (UniqueName: \"kubernetes.io/projected/ac79181c-997a-4974-9fdd-aea6b3f2903c-kube-api-access-w5ps8\") pod \"dns-operator-744455d44c-pmswz\" (UID: \"ac79181c-997a-4974-9fdd-aea6b3f2903c\") " pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.071979 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltpk4\" (UniqueName: \"kubernetes.io/projected/dc20bdf9-7763-495f-b56e-1bf9ff56686e-kube-api-access-ltpk4\") pod \"machine-config-controller-84d6567774-vhgpr\" (UID: \"dc20bdf9-7763-495f-b56e-1bf9ff56686e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.085661 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bllp8\" (UniqueName: \"kubernetes.io/projected/18b317fe-8d88-4915-bc98-89f42c8d4484-kube-api-access-bllp8\") pod \"ingress-operator-5b745b69d9-dmxjs\" (UID: \"18b317fe-8d88-4915-bc98-89f42c8d4484\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:43 crc kubenswrapper[4825]: W1007 19:02:43.086512 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9005f09_0f66_4541_8cb0_725ba2f4380d.slice/crio-9d959043512e379e4860c497b68686c30461d645a76b39bb1bceb6f24f92ae16 WatchSource:0}: Error finding container 9d959043512e379e4860c497b68686c30461d645a76b39bb1bceb6f24f92ae16: Status 404 returned error can't find the container with id 9d959043512e379e4860c497b68686c30461d645a76b39bb1bceb6f24f92ae16 Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.111883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7v9z\" (UniqueName: \"kubernetes.io/projected/8c611070-b4f9-4b32-b436-d8c94d8b09df-kube-api-access-k7v9z\") pod \"olm-operator-6b444d44fb-zt2mx\" (UID: \"8c611070-b4f9-4b32-b436-d8c94d8b09df\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.126523 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.134409 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.136179 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kh7p\" (UniqueName: \"kubernetes.io/projected/dbf5fc3f-5709-48d6-a0d0-b8406a396b00-kube-api-access-6kh7p\") pod \"openshift-controller-manager-operator-756b6f6bc6-5db5x\" (UID: \"dbf5fc3f-5709-48d6-a0d0-b8406a396b00\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.141736 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.177649 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.177873 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.184546 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ll4q5"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.185723 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f870c556-621f-4517-b1df-4e528a96f44f-service-ca-bundle\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.185792 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.185860 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3850383-5213-4cac-ae24-9f403ea96597-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.185884 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-default-certificate\") pod \"router-default-5444994796-jfp2b\" (UID: 
\"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.185912 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3850383-5213-4cac-ae24-9f403ea96597-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.185957 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-stats-auth\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186081 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sw7x\" (UniqueName: \"kubernetes.io/projected/f870c556-621f-4517-b1df-4e528a96f44f-kube-api-access-9sw7x\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186171 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkhrv\" (UniqueName: \"kubernetes.io/projected/1dc93346-dc91-4c6e-9567-0b273ed77af7-kube-api-access-mkhrv\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186239 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-certificates\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186334 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-bound-sa-token\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186363 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-config\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186443 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dc93346-dc91-4c6e-9567-0b273ed77af7-serving-cert\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186473 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-trusted-ca\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: 
\"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186526 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186553 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4f51b57-041d-4009-9db3-3579fa7bb84c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186587 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-tls\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186615 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-metrics-certs\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186653 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186681 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3850383-5213-4cac-ae24-9f403ea96597-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186746 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4f51b57-041d-4009-9db3-3579fa7bb84c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.186787 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpbxh\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-kube-api-access-tpbxh\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.188850 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:43.688826763 +0000 UTC m=+152.510865400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.193264 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.216962 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.219615 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rp8vt"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.225874 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" Oct 07 19:02:43 crc kubenswrapper[4825]: W1007 19:02:43.268343 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d77404b_ecd2_497c_9f7c_ec1ff470755e.slice/crio-502cc23db4f05c333f81227e1eebee85b811fcea08fadcc61e9b5ffd022b0590 WatchSource:0}: Error finding container 502cc23db4f05c333f81227e1eebee85b811fcea08fadcc61e9b5ffd022b0590: Status 404 returned error can't find the container with id 502cc23db4f05c333f81227e1eebee85b811fcea08fadcc61e9b5ffd022b0590 Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.288475 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289022 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-tls\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289181 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb31047-587d-4169-ae87-cd57a2186127-proxy-tls\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289320 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-metrics-certs\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289401 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9m8r\" (UniqueName: \"kubernetes.io/projected/37b43fb6-6cdc-40fd-b4db-d044e8e2d630-kube-api-access-w9m8r\") pod \"migrator-59844c95c7-qwqs9\" (UID: \"37b43fb6-6cdc-40fd-b4db-d044e8e2d630\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3850383-5213-4cac-ae24-9f403ea96597-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289572 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kv9\" (UniqueName: \"kubernetes.io/projected/c2a1e284-fa53-48bf-a60b-51783dcc8a21-kube-api-access-44kv9\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289662 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkdg\" (UniqueName: \"kubernetes.io/projected/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-kube-api-access-mtkdg\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: 
\"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289732 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeae2a9-70a6-4813-878c-ea9555215b74-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289826 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-serving-cert\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.289997 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wv4\" (UniqueName: \"kubernetes.io/projected/f53ff8ef-e493-4e2a-9895-e17bb40c8945-kube-api-access-s7wv4\") pod \"package-server-manager-789f6589d5-dx9x2\" (UID: \"f53ff8ef-e493-4e2a-9895-e17bb40c8945\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.290082 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2xcs\" (UniqueName: \"kubernetes.io/projected/427677ce-ec80-45d0-adff-6ca227d075be-kube-api-access-x2xcs\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.290549 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpbxh\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-kube-api-access-tpbxh\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.290637 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4f51b57-041d-4009-9db3-3579fa7bb84c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.290728 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c2a1e284-fa53-48bf-a60b-51783dcc8a21-node-bootstrap-token\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.290797 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c2a1e284-fa53-48bf-a60b-51783dcc8a21-certs\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.290862 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9486410b-2d41-458b-b7d0-1bdd4aeedd09-webhook-cert\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.290929 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f870c556-621f-4517-b1df-4e528a96f44f-service-ca-bundle\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.290973 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:43.790945683 +0000 UTC m=+152.612984320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.291064 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b5f06bf6-da11-4997-b46b-c1abb7030253-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7dfj7\" (UID: \"b5f06bf6-da11-4997-b46b-c1abb7030253\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.291261 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.291427 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.291470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3850383-5213-4cac-ae24-9f403ea96597-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.291510 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-signing-key\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.291530 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9486410b-2d41-458b-b7d0-1bdd4aeedd09-apiservice-cert\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 
19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.291566 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-default-certificate\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292579 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f870c556-621f-4517-b1df-4e528a96f44f-service-ca-bundle\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292697 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3850383-5213-4cac-ae24-9f403ea96597-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292754 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fsdc\" (UniqueName: \"kubernetes.io/projected/4c1decdc-5272-4f1d-8504-2d138b4bd138-kube-api-access-8fsdc\") pod \"ingress-canary-mxsfh\" (UID: \"4c1decdc-5272-4f1d-8504-2d138b4bd138\") " pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292775 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629410db-e237-4f48-8ad8-9a1b7d2edfec-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292794 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f53ff8ef-e493-4e2a-9895-e17bb40c8945-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dx9x2\" (UID: \"f53ff8ef-e493-4e2a-9895-e17bb40c8945\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292815 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31254990-7f83-47db-93e7-267da242edd3-config-volume\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292852 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-stats-auth\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292877 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-signing-cabundle\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292895 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-socket-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292971 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vmx5\" (UniqueName: \"kubernetes.io/projected/b5f06bf6-da11-4997-b46b-c1abb7030253-kube-api-access-5vmx5\") pod \"multus-admission-controller-857f4d67dd-7dfj7\" (UID: \"b5f06bf6-da11-4997-b46b-c1abb7030253\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.292995 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9e8c599-7697-44cb-84f5-e21f5f88111e-srv-cert\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293056 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3cb31047-587d-4169-ae87-cd57a2186127-auth-proxy-config\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293083 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgddq\" (UniqueName: \"kubernetes.io/projected/629410db-e237-4f48-8ad8-9a1b7d2edfec-kube-api-access-vgddq\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293127 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wf2m\" (UniqueName: \"kubernetes.io/projected/962646e1-6f06-40ed-a19a-d73f55b93d95-kube-api-access-8wf2m\") pod \"control-plane-machine-set-operator-78cbb6b69f-mhcpj\" (UID: \"962646e1-6f06-40ed-a19a-d73f55b93d95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293156 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3cb31047-587d-4169-ae87-cd57a2186127-images\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293260 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sw7x\" (UniqueName: \"kubernetes.io/projected/f870c556-621f-4517-b1df-4e528a96f44f-kube-api-access-9sw7x\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293356 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9486410b-2d41-458b-b7d0-1bdd4aeedd09-tmpfs\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc 
kubenswrapper[4825]: I1007 19:02:43.293389 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b305d341-f68e-40db-b37c-11660cdac447-secret-volume\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293419 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-mountpoint-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293512 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293550 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkhrv\" (UniqueName: \"kubernetes.io/projected/1dc93346-dc91-4c6e-9567-0b273ed77af7-kube-api-access-mkhrv\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293581 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0aeae2a9-70a6-4813-878c-ea9555215b74-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293652 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-certificates\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293692 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbph\" (UniqueName: \"kubernetes.io/projected/3cb31047-587d-4169-ae87-cd57a2186127-kube-api-access-2qbph\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293726 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-config\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293797 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeae2a9-70a6-4813-878c-ea9555215b74-config\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293827 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-csi-data-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293920 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2xhp\" (UniqueName: \"kubernetes.io/projected/3950d620-3e88-48fd-823a-f0ab8772ff5b-kube-api-access-p2xhp\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.293961 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lgg\" (UniqueName: \"kubernetes.io/projected/9486410b-2d41-458b-b7d0-1bdd4aeedd09-kube-api-access-w7lgg\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294495 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmr6w\" (UniqueName: \"kubernetes.io/projected/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-kube-api-access-hmr6w\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/962646e1-6f06-40ed-a19a-d73f55b93d95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mhcpj\" (UID: \"962646e1-6f06-40ed-a19a-d73f55b93d95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294695 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-bound-sa-token\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294728 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-config\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294756 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrcbt\" (UniqueName: \"kubernetes.io/projected/31254990-7f83-47db-93e7-267da242edd3-kube-api-access-wrcbt\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294815 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-serving-cert\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294872 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-registration-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294896 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9sd\" (UniqueName: \"kubernetes.io/projected/d9e8c599-7697-44cb-84f5-e21f5f88111e-kube-api-access-2s9sd\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294917 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c1decdc-5272-4f1d-8504-2d138b4bd138-cert\") pod \"ingress-canary-mxsfh\" (UID: \"4c1decdc-5272-4f1d-8504-2d138b4bd138\") " pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294943 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dc93346-dc91-4c6e-9567-0b273ed77af7-serving-cert\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294961 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b305d341-f68e-40db-b37c-11660cdac447-config-volume\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.294977 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9e8c599-7697-44cb-84f5-e21f5f88111e-profile-collector-cert\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.295002 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xf9\" (UniqueName: \"kubernetes.io/projected/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-kube-api-access-82xf9\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.295042 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-trusted-ca\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.295073 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.297541 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-metrics-certs\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.298045 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3850383-5213-4cac-ae24-9f403ea96597-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302070 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj952\" (UniqueName: \"kubernetes.io/projected/b305d341-f68e-40db-b37c-11660cdac447-kube-api-access-jj952\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302101 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31254990-7f83-47db-93e7-267da242edd3-metrics-tls\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302136 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/629410db-e237-4f48-8ad8-9a1b7d2edfec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302161 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-plugins-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302210 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4f51b57-041d-4009-9db3-3579fa7bb84c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302440 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302614 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-config\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 
19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.302955 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.303443 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc93346-dc91-4c6e-9567-0b273ed77af7-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.303914 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4f51b57-041d-4009-9db3-3579fa7bb84c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.307009 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-trusted-ca\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.307522 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-certificates\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: 
\"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.308536 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3850383-5213-4cac-ae24-9f403ea96597-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.308648 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-tls\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.308763 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4f51b57-041d-4009-9db3-3579fa7bb84c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.310065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-default-certificate\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.314113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1dc93346-dc91-4c6e-9567-0b273ed77af7-serving-cert\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.318149 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x74mv"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.321612 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f870c556-621f-4517-b1df-4e528a96f44f-stats-auth\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.348550 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sw7x\" (UniqueName: \"kubernetes.io/projected/f870c556-621f-4517-b1df-4e528a96f44f-kube-api-access-9sw7x\") pod \"router-default-5444994796-jfp2b\" (UID: \"f870c556-621f-4517-b1df-4e528a96f44f\") " pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: W1007 19:02:43.366635 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0bb39b_ac5f_48e5_87a8_80f21b338c02.slice/crio-e3b795932049d99605f3d7812391526232b474db7a7a1128678e64b70335d5b9 WatchSource:0}: Error finding container e3b795932049d99605f3d7812391526232b474db7a7a1128678e64b70335d5b9: Status 404 returned error can't find the container with id e3b795932049d99605f3d7812391526232b474db7a7a1128678e64b70335d5b9 Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.383806 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkhrv\" (UniqueName: 
\"kubernetes.io/projected/1dc93346-dc91-4c6e-9567-0b273ed77af7-kube-api-access-mkhrv\") pod \"authentication-operator-69f744f599-wrcjq\" (UID: \"1dc93346-dc91-4c6e-9567-0b273ed77af7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.401468 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sqfnk"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.402861 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3850383-5213-4cac-ae24-9f403ea96597-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vng26\" (UID: \"e3850383-5213-4cac-ae24-9f403ea96597\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.407936 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wv4\" (UniqueName: \"kubernetes.io/projected/f53ff8ef-e493-4e2a-9895-e17bb40c8945-kube-api-access-s7wv4\") pod \"package-server-manager-789f6589d5-dx9x2\" (UID: \"f53ff8ef-e493-4e2a-9895-e17bb40c8945\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.407967 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2xcs\" (UniqueName: \"kubernetes.io/projected/427677ce-ec80-45d0-adff-6ca227d075be-kube-api-access-x2xcs\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.407996 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c2a1e284-fa53-48bf-a60b-51783dcc8a21-certs\") pod 
\"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408011 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9486410b-2d41-458b-b7d0-1bdd4aeedd09-webhook-cert\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408028 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c2a1e284-fa53-48bf-a60b-51783dcc8a21-node-bootstrap-token\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408047 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408066 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b5f06bf6-da11-4997-b46b-c1abb7030253-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7dfj7\" (UID: \"b5f06bf6-da11-4997-b46b-c1abb7030253\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408084 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-signing-key\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408100 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9486410b-2d41-458b-b7d0-1bdd4aeedd09-apiservice-cert\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408119 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fsdc\" (UniqueName: \"kubernetes.io/projected/4c1decdc-5272-4f1d-8504-2d138b4bd138-kube-api-access-8fsdc\") pod \"ingress-canary-mxsfh\" (UID: \"4c1decdc-5272-4f1d-8504-2d138b4bd138\") " pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408136 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f53ff8ef-e493-4e2a-9895-e17bb40c8945-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dx9x2\" (UID: \"f53ff8ef-e493-4e2a-9895-e17bb40c8945\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408153 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31254990-7f83-47db-93e7-267da242edd3-config-volume\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408170 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629410db-e237-4f48-8ad8-9a1b7d2edfec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408189 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-signing-cabundle\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408205 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-socket-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408224 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vmx5\" (UniqueName: \"kubernetes.io/projected/b5f06bf6-da11-4997-b46b-c1abb7030253-kube-api-access-5vmx5\") pod \"multus-admission-controller-857f4d67dd-7dfj7\" (UID: \"b5f06bf6-da11-4997-b46b-c1abb7030253\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408253 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9e8c599-7697-44cb-84f5-e21f5f88111e-srv-cert\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408270 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3cb31047-587d-4169-ae87-cd57a2186127-auth-proxy-config\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408286 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgddq\" (UniqueName: \"kubernetes.io/projected/629410db-e237-4f48-8ad8-9a1b7d2edfec-kube-api-access-vgddq\") pod \"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408305 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wf2m\" (UniqueName: \"kubernetes.io/projected/962646e1-6f06-40ed-a19a-d73f55b93d95-kube-api-access-8wf2m\") pod \"control-plane-machine-set-operator-78cbb6b69f-mhcpj\" (UID: \"962646e1-6f06-40ed-a19a-d73f55b93d95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408323 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3cb31047-587d-4169-ae87-cd57a2186127-images\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408344 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9486410b-2d41-458b-b7d0-1bdd4aeedd09-tmpfs\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408373 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b305d341-f68e-40db-b37c-11660cdac447-secret-volume\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408390 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-mountpoint-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408409 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408426 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aeae2a9-70a6-4813-878c-ea9555215b74-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408654 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbph\" (UniqueName: \"kubernetes.io/projected/3cb31047-587d-4169-ae87-cd57a2186127-kube-api-access-2qbph\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408670 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-config\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408687 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-csi-data-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408703 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeae2a9-70a6-4813-878c-ea9555215b74-config\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408720 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2xhp\" (UniqueName: 
\"kubernetes.io/projected/3950d620-3e88-48fd-823a-f0ab8772ff5b-kube-api-access-p2xhp\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408738 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmr6w\" (UniqueName: \"kubernetes.io/projected/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-kube-api-access-hmr6w\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408754 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lgg\" (UniqueName: \"kubernetes.io/projected/9486410b-2d41-458b-b7d0-1bdd4aeedd09-kube-api-access-w7lgg\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408775 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/962646e1-6f06-40ed-a19a-d73f55b93d95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mhcpj\" (UID: \"962646e1-6f06-40ed-a19a-d73f55b93d95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408800 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrcbt\" (UniqueName: \"kubernetes.io/projected/31254990-7f83-47db-93e7-267da242edd3-kube-api-access-wrcbt\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc 
kubenswrapper[4825]: I1007 19:02:43.408814 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-serving-cert\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408832 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-registration-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408848 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9sd\" (UniqueName: \"kubernetes.io/projected/d9e8c599-7697-44cb-84f5-e21f5f88111e-kube-api-access-2s9sd\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408866 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c1decdc-5272-4f1d-8504-2d138b4bd138-cert\") pod \"ingress-canary-mxsfh\" (UID: \"4c1decdc-5272-4f1d-8504-2d138b4bd138\") " pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408882 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b305d341-f68e-40db-b37c-11660cdac447-config-volume\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408896 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9e8c599-7697-44cb-84f5-e21f5f88111e-profile-collector-cert\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408915 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xf9\" (UniqueName: \"kubernetes.io/projected/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-kube-api-access-82xf9\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408934 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408971 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj952\" (UniqueName: \"kubernetes.io/projected/b305d341-f68e-40db-b37c-11660cdac447-kube-api-access-jj952\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.408986 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31254990-7f83-47db-93e7-267da242edd3-metrics-tls\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409004 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/629410db-e237-4f48-8ad8-9a1b7d2edfec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409019 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-plugins-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409038 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb31047-587d-4169-ae87-cd57a2186127-proxy-tls\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409055 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9m8r\" (UniqueName: \"kubernetes.io/projected/37b43fb6-6cdc-40fd-b4db-d044e8e2d630-kube-api-access-w9m8r\") pod \"migrator-59844c95c7-qwqs9\" (UID: \"37b43fb6-6cdc-40fd-b4db-d044e8e2d630\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409072 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44kv9\" (UniqueName: \"kubernetes.io/projected/c2a1e284-fa53-48bf-a60b-51783dcc8a21-kube-api-access-44kv9\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409090 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409107 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeae2a9-70a6-4813-878c-ea9555215b74-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409124 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkdg\" (UniqueName: \"kubernetes.io/projected/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-kube-api-access-mtkdg\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.409140 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.410681 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3cb31047-587d-4169-ae87-cd57a2186127-images\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.415493 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31254990-7f83-47db-93e7-267da242edd3-config-volume\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.415872 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-registration-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.416113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629410db-e237-4f48-8ad8-9a1b7d2edfec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.416530 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/3cb31047-587d-4169-ae87-cd57a2186127-auth-proxy-config\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.416824 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-signing-cabundle\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.416891 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-socket-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.417345 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b305d341-f68e-40db-b37c-11660cdac447-config-volume\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.417600 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:43.917585027 +0000 UTC m=+152.739623664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.421754 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.422367 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9486410b-2d41-458b-b7d0-1bdd4aeedd09-tmpfs\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.423015 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.423320 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-config\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.423431 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-csi-data-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.424040 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeae2a9-70a6-4813-878c-ea9555215b74-config\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.424186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-mountpoint-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.425621 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.425751 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/427677ce-ec80-45d0-adff-6ca227d075be-plugins-dir\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.430481 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c1decdc-5272-4f1d-8504-2d138b4bd138-cert\") pod \"ingress-canary-mxsfh\" (UID: \"4c1decdc-5272-4f1d-8504-2d138b4bd138\") " pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.432760 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9486410b-2d41-458b-b7d0-1bdd4aeedd09-apiservice-cert\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.434174 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/962646e1-6f06-40ed-a19a-d73f55b93d95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mhcpj\" (UID: \"962646e1-6f06-40ed-a19a-d73f55b93d95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.434909 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-serving-cert\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.435904 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f53ff8ef-e493-4e2a-9895-e17bb40c8945-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dx9x2\" (UID: \"f53ff8ef-e493-4e2a-9895-e17bb40c8945\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.436435 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c2a1e284-fa53-48bf-a60b-51783dcc8a21-certs\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.436819 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9e8c599-7697-44cb-84f5-e21f5f88111e-srv-cert\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.436895 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9486410b-2d41-458b-b7d0-1bdd4aeedd09-webhook-cert\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.442348 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpbxh\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-kube-api-access-tpbxh\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.445948 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.447695 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-serving-cert\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.447713 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/629410db-e237-4f48-8ad8-9a1b7d2edfec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.448112 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-bound-sa-token\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.448165 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9e8c599-7697-44cb-84f5-e21f5f88111e-profile-collector-cert\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.448362 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb31047-587d-4169-ae87-cd57a2186127-proxy-tls\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.448558 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b5f06bf6-da11-4997-b46b-c1abb7030253-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7dfj7\" (UID: \"b5f06bf6-da11-4997-b46b-c1abb7030253\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.448772 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31254990-7f83-47db-93e7-267da242edd3-metrics-tls\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.450200 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c2a1e284-fa53-48bf-a60b-51783dcc8a21-node-bootstrap-token\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.452650 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b305d341-f68e-40db-b37c-11660cdac447-secret-volume\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.454635 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeae2a9-70a6-4813-878c-ea9555215b74-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.462851 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-signing-key\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.470750 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2xcs\" (UniqueName: \"kubernetes.io/projected/427677ce-ec80-45d0-adff-6ca227d075be-kube-api-access-x2xcs\") pod \"csi-hostpathplugin-mnjlm\" (UID: \"427677ce-ec80-45d0-adff-6ca227d075be\") " pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.477845 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.485823 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2xhp\" (UniqueName: \"kubernetes.io/projected/3950d620-3e88-48fd-823a-f0ab8772ff5b-kube-api-access-p2xhp\") pod \"marketplace-operator-79b997595-9rmtb\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.502570 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.511157 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.511686 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.011662401 +0000 UTC m=+152.833701038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.516896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wv4\" (UniqueName: \"kubernetes.io/projected/f53ff8ef-e493-4e2a-9895-e17bb40c8945-kube-api-access-s7wv4\") pod \"package-server-manager-789f6589d5-dx9x2\" (UID: \"f53ff8ef-e493-4e2a-9895-e17bb40c8945\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.520490 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.531207 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fsdc\" (UniqueName: \"kubernetes.io/projected/4c1decdc-5272-4f1d-8504-2d138b4bd138-kube-api-access-8fsdc\") pod \"ingress-canary-mxsfh\" (UID: \"4c1decdc-5272-4f1d-8504-2d138b4bd138\") " pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.579164 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.580013 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xf9\" (UniqueName: \"kubernetes.io/projected/0fbd1854-844d-4a1e-a44d-35daa5ee8a28-kube-api-access-82xf9\") pod \"openshift-config-operator-7777fb866f-7rf45\" (UID: \"0fbd1854-844d-4a1e-a44d-35daa5ee8a28\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.599855 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmr6w\" (UniqueName: \"kubernetes.io/projected/ab4d6007-b58e-4566-9bc5-4d24b761a4ac-kube-api-access-hmr6w\") pod \"service-ca-9c57cc56f-8xv6b\" (UID: \"ab4d6007-b58e-4566-9bc5-4d24b761a4ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.607625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lgg\" (UniqueName: \"kubernetes.io/projected/9486410b-2d41-458b-b7d0-1bdd4aeedd09-kube-api-access-w7lgg\") pod \"packageserver-d55dfcdfc-fww5f\" (UID: \"9486410b-2d41-458b-b7d0-1bdd4aeedd09\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 
19:02:43.612646 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.613067 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.113050117 +0000 UTC m=+152.935088754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.638069 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" event={"ID":"12a12314-9e91-4b18-b2d6-f41489add427","Type":"ContainerStarted","Data":"e992697381a3d6bc07150b7998063be64dd28397bb4e0e123f13523a1050cfaa"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.643261 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" event={"ID":"0581c024-1217-4c9d-b927-45a4327e8eec","Type":"ContainerStarted","Data":"cf7921f8bbb0a4ca4b5b3129c4ea1159c469f007e70e841e0d1025962de060cb"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.643300 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" event={"ID":"0581c024-1217-4c9d-b927-45a4327e8eec","Type":"ContainerStarted","Data":"ff32dc63aa96ceadb177e5744482cc20496538c325030a58cff358ea4fc46773"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.645771 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vmx5\" (UniqueName: \"kubernetes.io/projected/b5f06bf6-da11-4997-b46b-c1abb7030253-kube-api-access-5vmx5\") pod \"multus-admission-controller-857f4d67dd-7dfj7\" (UID: \"b5f06bf6-da11-4997-b46b-c1abb7030253\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.651407 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" event={"ID":"7c0bb39b-ac5f-48e5-87a8-80f21b338c02","Type":"ContainerStarted","Data":"e3b795932049d99605f3d7812391526232b474db7a7a1128678e64b70335d5b9"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.652044 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.674451 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s9sd\" (UniqueName: \"kubernetes.io/projected/d9e8c599-7697-44cb-84f5-e21f5f88111e-kube-api-access-2s9sd\") pod \"catalog-operator-68c6474976-xmvk6\" (UID: \"d9e8c599-7697-44cb-84f5-e21f5f88111e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.675244 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrcbt\" (UniqueName: \"kubernetes.io/projected/31254990-7f83-47db-93e7-267da242edd3-kube-api-access-wrcbt\") pod \"dns-default-lmk59\" (UID: \"31254990-7f83-47db-93e7-267da242edd3\") " pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.694084 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hpckv" event={"ID":"f9005f09-0f66-4541-8cb0-725ba2f4380d","Type":"ContainerStarted","Data":"5bdf59d2f7848fd644136592553487febbca3732fc4e1408c98cb4c63b92e394"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.694136 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hpckv" event={"ID":"f9005f09-0f66-4541-8cb0-725ba2f4380d","Type":"ContainerStarted","Data":"9d959043512e379e4860c497b68686c30461d645a76b39bb1bceb6f24f92ae16"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.695320 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.698749 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.699384 4825 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hpckv" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.701364 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-hpckv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.701408 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hpckv" podUID="f9005f09-0f66-4541-8cb0-725ba2f4380d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.704400 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lfhcp"] Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.718181 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.718999 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" event={"ID":"d9758819-ad32-401c-a327-bb0dd9740946","Type":"ContainerStarted","Data":"cdc78b7c7463d1a124da8df09cfade43239c4529f961568326f1ec27c68e0e14"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.719049 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" 
event={"ID":"d9758819-ad32-401c-a327-bb0dd9740946","Type":"ContainerStarted","Data":"64871dbcc002afffa181f1b97fff42d3d13a67aa8fb78a7e6ec3a5095b58fae2"} Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.720534 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.220508909 +0000 UTC m=+153.042547546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.724987 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.737133 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.748296 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqfnk" event={"ID":"21bd5368-2631-4c6c-94cf-d6e64b1dd657","Type":"ContainerStarted","Data":"e7f4a4e2020b010b2adfee9ce29f148bb32ef260d9b150fb02524dd668d07e56"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.753618 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" event={"ID":"5c60c8b3-aa32-442a-a222-3ed689f4dd61","Type":"ContainerStarted","Data":"787b0daf2a8438a41113e75a26ae5c124ac9c403d921d85fbe0e00b5e18f1bb0"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.757082 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9m8r\" (UniqueName: \"kubernetes.io/projected/37b43fb6-6cdc-40fd-b4db-d044e8e2d630-kube-api-access-w9m8r\") pod \"migrator-59844c95c7-qwqs9\" (UID: \"37b43fb6-6cdc-40fd-b4db-d044e8e2d630\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.768119 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" event={"ID":"1d77404b-ecd2-497c-9f7c-ec1ff470755e","Type":"ContainerStarted","Data":"502cc23db4f05c333f81227e1eebee85b811fcea08fadcc61e9b5ffd022b0590"} Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.773603 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbph\" (UniqueName: \"kubernetes.io/projected/3cb31047-587d-4169-ae87-cd57a2186127-kube-api-access-2qbph\") pod \"machine-config-operator-74547568cd-254wk\" (UID: \"3cb31047-587d-4169-ae87-cd57a2186127\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 
19:02:43.774321 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44kv9\" (UniqueName: \"kubernetes.io/projected/c2a1e284-fa53-48bf-a60b-51783dcc8a21-kube-api-access-44kv9\") pod \"machine-config-server-qbvdj\" (UID: \"c2a1e284-fa53-48bf-a60b-51783dcc8a21\") " pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.774771 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wf2m\" (UniqueName: \"kubernetes.io/projected/962646e1-6f06-40ed-a19a-d73f55b93d95-kube-api-access-8wf2m\") pod \"control-plane-machine-set-operator-78cbb6b69f-mhcpj\" (UID: \"962646e1-6f06-40ed-a19a-d73f55b93d95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.775254 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aeae2a9-70a6-4813-878c-ea9555215b74-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b8qt8\" (UID: \"0aeae2a9-70a6-4813-878c-ea9555215b74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.776969 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgddq\" (UniqueName: \"kubernetes.io/projected/629410db-e237-4f48-8ad8-9a1b7d2edfec-kube-api-access-vgddq\") pod \"kube-storage-version-migrator-operator-b67b599dd-qwwdm\" (UID: \"629410db-e237-4f48-8ad8-9a1b7d2edfec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.785739 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mxsfh" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.787751 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkdg\" (UniqueName: \"kubernetes.io/projected/2f07afbb-46fd-4c4a-b791-2798ecc11ca0-kube-api-access-mtkdg\") pod \"service-ca-operator-777779d784-g8q6q\" (UID: \"2f07afbb-46fd-4c4a-b791-2798ecc11ca0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.792730 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.804539 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj952\" (UniqueName: \"kubernetes.io/projected/b305d341-f68e-40db-b37c-11660cdac447-kube-api-access-jj952\") pod \"collect-profiles-29331060-f6rbn\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.821343 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.822562 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.322549196 +0000 UTC m=+153.144587833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.833956 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.843631 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.854918 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.917118 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.922818 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:43 crc kubenswrapper[4825]: E1007 19:02:43.923220 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:44.42319396 +0000 UTC m=+153.245232597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.934519 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.944871 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.962375 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.971812 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.981653 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.990730 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" Oct 07 19:02:43 crc kubenswrapper[4825]: I1007 19:02:43.998452 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qbvdj" Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.035416 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.035727 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.535716572 +0000 UTC m=+153.357755209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.136475 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.137021 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.636980145 +0000 UTC m=+153.459018782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.238430 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.238805 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.738792236 +0000 UTC m=+153.560830873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.314456 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj"] Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.341665 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.84163467 +0000 UTC m=+153.663673307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.342826 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.344164 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.344657 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.844639616 +0000 UTC m=+153.666678253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.446858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.447316 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.947293593 +0000 UTC m=+153.769332230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.447474 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.447819 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:44.94781306 +0000 UTC m=+153.769851687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.518080 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.549676 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.551289 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.051215741 +0000 UTC m=+153.873254378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.561160 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hpckv" podStartSLOduration=133.561125788 podStartE2EDuration="2m13.561125788s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:44.522521115 +0000 UTC m=+153.344559752" watchObservedRunningTime="2025-10-07 19:02:44.561125788 +0000 UTC m=+153.383164425" Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.572726 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.579705 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.581570 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.591756 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.598457 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-pmswz"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.652323 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.652777 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.152762104 +0000 UTC m=+153.974800741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.753943 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.754119 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.254084559 +0000 UTC m=+154.076123196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.754334 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.754750 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.25474206 +0000 UTC m=+154.076780697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.820374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" event={"ID":"bb91d816-c309-4f6c-96b3-79ae595907f7","Type":"ContainerStarted","Data":"ae7c986cc502735308fe743245a48989b5c1cd8948642f3f9e55b02cc0dbb3af"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.835041 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" podStartSLOduration=132.835010213 podStartE2EDuration="2m12.835010213s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:44.833091932 +0000 UTC m=+153.655130569" watchObservedRunningTime="2025-10-07 19:02:44.835010213 +0000 UTC m=+153.657048850" Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.844585 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" event={"ID":"a3b4f007-8217-4308-996e-394b0c3d072c","Type":"ContainerStarted","Data":"ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.844641 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" 
event={"ID":"a3b4f007-8217-4308-996e-394b0c3d072c","Type":"ContainerStarted","Data":"958bd3482e10c9f530a378be409e1bec4735655ac263a940295c21ac720ff270"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.845715 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.852130 4825 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qr8n7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.852222 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" podUID="a3b4f007-8217-4308-996e-394b0c3d072c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.855064 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.855431 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.856157 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.356140527 +0000 UTC m=+154.178179154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.883382 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mxsfh"] Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.885103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" event={"ID":"12a12314-9e91-4b18-b2d6-f41489add427","Type":"ContainerStarted","Data":"7e2a33a5bfac792d43bffe29702017ac2be0946d54a18bceb3dc1c06683be52c"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.885154 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" event={"ID":"12a12314-9e91-4b18-b2d6-f41489add427","Type":"ContainerStarted","Data":"c85e1f604202fdf4328d531b3b6bbec47b7110c456a17f6bfe6aee74adf938c6"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.896181 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jfp2b" event={"ID":"f870c556-621f-4517-b1df-4e528a96f44f","Type":"ContainerStarted","Data":"51b359b2ea68b82c997ac446216203c66e1e9076cb250c5b674cca5a0058eaa9"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.896248 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-jfp2b" event={"ID":"f870c556-621f-4517-b1df-4e528a96f44f","Type":"ContainerStarted","Data":"196c478c5d957b29c485fb1fe4186f3e093799aeebec5cee0e54767605ee4df4"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.960656 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" event={"ID":"d9758819-ad32-401c-a327-bb0dd9740946","Type":"ContainerStarted","Data":"dee4359f8b7847f411ff9e4a90c4239733424b80edddf017292b04f51ccbf452"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.963333 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mdzv5" podStartSLOduration=132.963318689 podStartE2EDuration="2m12.963318689s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:44.903502939 +0000 UTC m=+153.725541576" watchObservedRunningTime="2025-10-07 19:02:44.963318689 +0000 UTC m=+153.785357326" Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.965782 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:44 crc kubenswrapper[4825]: E1007 19:02:44.968413 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.468395781 +0000 UTC m=+154.290434418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.973245 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" event={"ID":"4e3cbe84-8ae0-4019-9656-3db9415aee73","Type":"ContainerStarted","Data":"df640d6490169f1fe8cbb36529e18a043b0fe49d09caeb7d60741b84291e0ab1"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.977589 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" event={"ID":"a07b3c13-9b79-45d7-a759-e9c119bbe37b","Type":"ContainerStarted","Data":"eebd3ca5547ffd96c0b6a7bae657e784ad992f8ce6c9db020667bf9df330f701"} Oct 07 19:02:44 crc kubenswrapper[4825]: I1007 19:02:44.986923 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" event={"ID":"18b317fe-8d88-4915-bc98-89f42c8d4484","Type":"ContainerStarted","Data":"5ef1e71d213825d491fc158f0c27e64c8c69f1d45683c80abf7f08438dd9d976"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.014354 4825 generic.go:334] "Generic (PLEG): container finished" podID="1d77404b-ecd2-497c-9f7c-ec1ff470755e" containerID="6b5d6cba28649fdfd03fb34613e0ffd902ef09914171cfb07dac90f329e5dda9" exitCode=0 Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.014572 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" 
event={"ID":"1d77404b-ecd2-497c-9f7c-ec1ff470755e","Type":"ContainerDied","Data":"6b5d6cba28649fdfd03fb34613e0ffd902ef09914171cfb07dac90f329e5dda9"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.040122 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qbvdj" event={"ID":"c2a1e284-fa53-48bf-a60b-51783dcc8a21","Type":"ContainerStarted","Data":"fe63b93100b7c8eb9b5c9770174eedd7e7f510e0651f11028fd19b0bb7a72b3f"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.040638 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qbvdj" event={"ID":"c2a1e284-fa53-48bf-a60b-51783dcc8a21","Type":"ContainerStarted","Data":"2459ebbda394f9b5c9a7fac12c3ee0f43a87d8a14ade1f784e74e1902de7c868"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.048180 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrcjq"] Oct 07 19:02:45 crc kubenswrapper[4825]: W1007 19:02:45.060095 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3850383_5213_4cac_ae24_9f403ea96597.slice/crio-24df44852b3c5912cf9c7509655c75bd6b493b962f783ebc96aaeda2d603e4b9 WatchSource:0}: Error finding container 24df44852b3c5912cf9c7509655c75bd6b493b962f783ebc96aaeda2d603e4b9: Status 404 returned error can't find the container with id 24df44852b3c5912cf9c7509655c75bd6b493b962f783ebc96aaeda2d603e4b9 Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.060611 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" event={"ID":"7c0bb39b-ac5f-48e5-87a8-80f21b338c02","Type":"ContainerStarted","Data":"c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.060900 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.066832 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.067016 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.566976939 +0000 UTC m=+154.389015576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.068093 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.069069 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.569050915 +0000 UTC m=+154.391089552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.079890 4825 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-x74mv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.079963 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" podUID="7c0bb39b-ac5f-48e5-87a8-80f21b338c02" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.083385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" event={"ID":"5c60c8b3-aa32-442a-a222-3ed689f4dd61","Type":"ContainerStarted","Data":"85ea0a011855237da0b45ffb95d00cd8b490a7ce05481fe8b5922957737517fc"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.084817 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:45 crc 
kubenswrapper[4825]: I1007 19:02:45.087253 4825 patch_prober.go:28] interesting pod/console-operator-58897d9998-ll4q5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.087320 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" podUID="5c60c8b3-aa32-442a-a222-3ed689f4dd61" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.089385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" event={"ID":"e92d41cb-ecb1-461c-bdec-314b33ae9d36","Type":"ContainerStarted","Data":"8cc28853a9ccddd1290beab7457a4571c9372445455c07dd714f5aa2948a36d6"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.094666 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" event={"ID":"8c611070-b4f9-4b32-b436-d8c94d8b09df","Type":"ContainerStarted","Data":"8d509d98c60a604d0d1314b4c9d6d9aad4d852b7a3e90fe61bc64ee68afb138c"} Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.108802 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-hpckv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.108877 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hpckv" podUID="f9005f09-0f66-4541-8cb0-725ba2f4380d" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.109891 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqfnk" event={"ID":"21bd5368-2631-4c6c-94cf-d6e64b1dd657","Type":"ContainerStarted","Data":"9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1"} Oct 07 19:02:45 crc kubenswrapper[4825]: W1007 19:02:45.135468 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc93346_dc91_4c6e_9567_0b273ed77af7.slice/crio-bb81bcc6e8d3aea67db757ba1e856f363d6a9e835a2b64ebc0a406bcf1827ea4 WatchSource:0}: Error finding container bb81bcc6e8d3aea67db757ba1e856f363d6a9e835a2b64ebc0a406bcf1827ea4: Status 404 returned error can't find the container with id bb81bcc6e8d3aea67db757ba1e856f363d6a9e835a2b64ebc0a406bcf1827ea4 Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.170887 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lmk59"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.176240 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.182427 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.682406384 +0000 UTC m=+154.504445021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.193426 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.193939 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.693913522 +0000 UTC m=+154.515952149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.202491 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9rmtb"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.203498 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mnjlm"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.267940 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hvsq" podStartSLOduration=133.267915444 podStartE2EDuration="2m13.267915444s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.259073121 +0000 UTC m=+154.081111758" watchObservedRunningTime="2025-10-07 19:02:45.267915444 +0000 UTC m=+154.089954081" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.294935 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.295480 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.795450613 +0000 UTC m=+154.617489250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.396682 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.397043 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.897027396 +0000 UTC m=+154.719066033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.428579 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8xv6b"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.437641 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.439931 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7dfj7"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.468042 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7rf45"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.476848 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.481671 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-254wk"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.499267 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.499828 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:45.999602791 +0000 UTC m=+154.821641428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.501397 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.502071 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" podStartSLOduration=133.50205067 podStartE2EDuration="2m13.50205067s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.498585439 +0000 UTC m=+154.320624076" watchObservedRunningTime="2025-10-07 19:02:45.50205067 +0000 UTC m=+154.324089307" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.528541 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.532995 4825 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.536162 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.550053 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.553814 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.558351 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:45 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:45 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:45 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.558431 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.569454 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.583090 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q"] Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 
19:02:45.587104 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-sqfnk" podStartSLOduration=134.587078314 podStartE2EDuration="2m14.587078314s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.579775751 +0000 UTC m=+154.401814388" watchObservedRunningTime="2025-10-07 19:02:45.587078314 +0000 UTC m=+154.409116941" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.600553 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.600837 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.100825413 +0000 UTC m=+154.922864050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.649242 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jfp2b" podStartSLOduration=133.649185117 podStartE2EDuration="2m13.649185117s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.620635496 +0000 UTC m=+154.442674133" watchObservedRunningTime="2025-10-07 19:02:45.649185117 +0000 UTC m=+154.471223824" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.685142 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qbvdj" podStartSLOduration=5.685104534 podStartE2EDuration="5.685104534s" podCreationTimestamp="2025-10-07 19:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.667831962 +0000 UTC m=+154.489870599" watchObservedRunningTime="2025-10-07 19:02:45.685104534 +0000 UTC m=+154.507143171" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.701512 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.701912 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.20188719 +0000 UTC m=+155.023925827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.704872 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" podStartSLOduration=134.704842434 podStartE2EDuration="2m14.704842434s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.698205563 +0000 UTC m=+154.520244220" watchObservedRunningTime="2025-10-07 19:02:45.704842434 +0000 UTC m=+154.526881071" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.751161 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" podStartSLOduration=133.751129803 podStartE2EDuration="2m13.751129803s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.750268975 +0000 UTC m=+154.572307602" 
watchObservedRunningTime="2025-10-07 19:02:45.751129803 +0000 UTC m=+154.573168440" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.786650 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" podStartSLOduration=133.786632136 podStartE2EDuration="2m13.786632136s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.785441527 +0000 UTC m=+154.607480164" watchObservedRunningTime="2025-10-07 19:02:45.786632136 +0000 UTC m=+154.608670773" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.803024 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.803396 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.303382581 +0000 UTC m=+155.125421218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.823842 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lnxqj" podStartSLOduration=134.823823884 podStartE2EDuration="2m14.823823884s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.823184683 +0000 UTC m=+154.645223340" watchObservedRunningTime="2025-10-07 19:02:45.823823884 +0000 UTC m=+154.645862521" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.887875 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnfts" podStartSLOduration=134.887853348 podStartE2EDuration="2m14.887853348s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:45.865701371 +0000 UTC m=+154.687740008" watchObservedRunningTime="2025-10-07 19:02:45.887853348 +0000 UTC m=+154.709891985" Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.906192 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.906347 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.406297216 +0000 UTC m=+155.228335853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:45 crc kubenswrapper[4825]: I1007 19:02:45.906972 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:45 crc kubenswrapper[4825]: E1007 19:02:45.907532 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.407518876 +0000 UTC m=+155.229557503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.008004 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.008297 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.508283403 +0000 UTC m=+155.330322040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.110278 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.110920 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.610907559 +0000 UTC m=+155.432946196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.129503 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mxsfh" event={"ID":"4c1decdc-5272-4f1d-8504-2d138b4bd138","Type":"ContainerStarted","Data":"e031d66d57164f996757da65597f94e7793b2af85c6c95420a8b9c542a012c5e"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.129548 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mxsfh" event={"ID":"4c1decdc-5272-4f1d-8504-2d138b4bd138","Type":"ContainerStarted","Data":"4bc68da7e9bd2dd49509bbc1f8c7157af201be606912234326f4368db021d40c"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.131099 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" event={"ID":"37b43fb6-6cdc-40fd-b4db-d044e8e2d630","Type":"ContainerStarted","Data":"51366c75ea282d7dfbf53e9f02f4838b6b5fd2e31677c7c71219972c80819ddc"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.134735 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" event={"ID":"dc20bdf9-7763-495f-b56e-1bf9ff56686e","Type":"ContainerStarted","Data":"9416f1a1f6d0910801745b13c3dd91fb062907cc97c525bd700dad7450e401a7"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.134761 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" 
event={"ID":"dc20bdf9-7763-495f-b56e-1bf9ff56686e","Type":"ContainerStarted","Data":"768d654c95bd63970ade2d4ca479dab4723e228107e822d56bd8061d4aef917e"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.134773 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" event={"ID":"dc20bdf9-7763-495f-b56e-1bf9ff56686e","Type":"ContainerStarted","Data":"59328695ff2f8b522df841171a769bdae301a13b9682c3ae9d709bf58c6de5da"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.135680 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" event={"ID":"b305d341-f68e-40db-b37c-11660cdac447","Type":"ContainerStarted","Data":"2a23e2ebbe7ba015cf7e5c0355b42f349bc8f63f31eeb99a0ca383652cc7df2c"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.137852 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" event={"ID":"9486410b-2d41-458b-b7d0-1bdd4aeedd09","Type":"ContainerStarted","Data":"d887475eebdba9b5799dc0a0b57bccdf4b0f4af273584287eb265e624a6278aa"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.140667 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" event={"ID":"1d77404b-ecd2-497c-9f7c-ec1ff470755e","Type":"ContainerStarted","Data":"3ed845718b2597f012ef8168ece2a10346e3b1cece5f4ece7b462afbe86e7ec2"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.156571 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" event={"ID":"f53ff8ef-e493-4e2a-9895-e17bb40c8945","Type":"ContainerStarted","Data":"12e8650c0e08973213270dd8a1862a3b8aeeee47281f6dcebeda9d5a5f5be547"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.158966 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" event={"ID":"427677ce-ec80-45d0-adff-6ca227d075be","Type":"ContainerStarted","Data":"0d8ca7664407257f010a3a3e473d68e989e3578e8b4e40f65c3c48b7144164c9"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.161196 4825 generic.go:334] "Generic (PLEG): container finished" podID="a07b3c13-9b79-45d7-a759-e9c119bbe37b" containerID="53996480464c7a2a5311bddbdfbf5ee42b89c062c766af0dd15fd70dc67242e1" exitCode=0 Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.161611 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" event={"ID":"a07b3c13-9b79-45d7-a759-e9c119bbe37b","Type":"ContainerDied","Data":"53996480464c7a2a5311bddbdfbf5ee42b89c062c766af0dd15fd70dc67242e1"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.164528 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" event={"ID":"d9e8c599-7697-44cb-84f5-e21f5f88111e","Type":"ContainerStarted","Data":"a7d3e564157aad4a9bf202f5b51913612f440a2189a6a184ea902f6dd825d7c2"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.168790 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" event={"ID":"4e3cbe84-8ae0-4019-9656-3db9415aee73","Type":"ContainerStarted","Data":"17a3e4c5197aec9d42c5e3d16f961e8ac70aceaeba8afcb3b9de197074079fdd"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.173030 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" event={"ID":"bb91d816-c309-4f6c-96b3-79ae595907f7","Type":"ContainerStarted","Data":"835cd2b912d599560229d6b3f12b9d04331ed923ef1314699c7d6a3c6a0bc806"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.206938 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jzbkj" event={"ID":"e92d41cb-ecb1-461c-bdec-314b33ae9d36","Type":"ContainerStarted","Data":"8f82d041eae05bdff85633d0caf05e736b7e30cbc4845eed5a12a03d81d938d8"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.213496 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.217625 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.717576176 +0000 UTC m=+155.539614813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.218150 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.218316 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.718291579 +0000 UTC m=+155.540330216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.256002 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" event={"ID":"1dc93346-dc91-4c6e-9567-0b273ed77af7","Type":"ContainerStarted","Data":"f8348f05962c48dfa076a1cf2ae67a8538fc1200849e6f8038a90d5a79559cde"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.256135 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" event={"ID":"1dc93346-dc91-4c6e-9567-0b273ed77af7","Type":"ContainerStarted","Data":"bb81bcc6e8d3aea67db757ba1e856f363d6a9e835a2b64ebc0a406bcf1827ea4"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.271018 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" event={"ID":"ac79181c-997a-4974-9fdd-aea6b3f2903c","Type":"ContainerStarted","Data":"94d55ca2b174681b8482abb836a3bd3bcacd016a83a3b92dd3594008e3d99940"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.271077 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" event={"ID":"ac79181c-997a-4974-9fdd-aea6b3f2903c","Type":"ContainerStarted","Data":"45c3c1b50a5c8162d6ae3fdccbd3bda435013cd3a5e4d18ca52b523f0e28c0d0"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.273469 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" event={"ID":"3cb31047-587d-4169-ae87-cd57a2186127","Type":"ContainerStarted","Data":"40429a0eb77e19f11c3d2a2f174790c0d52589198bef40f6b53d0d7cfd3b818e"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.284705 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" event={"ID":"e3850383-5213-4cac-ae24-9f403ea96597","Type":"ContainerStarted","Data":"bbfd5a3d2a5ffe8e99db287bd34329740e0346b2dfba2856a795ff22f7f6d4d6"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.284748 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" event={"ID":"e3850383-5213-4cac-ae24-9f403ea96597","Type":"ContainerStarted","Data":"24df44852b3c5912cf9c7509655c75bd6b493b962f783ebc96aaeda2d603e4b9"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.287758 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lmk59" event={"ID":"31254990-7f83-47db-93e7-267da242edd3","Type":"ContainerStarted","Data":"7d140f644835786756f4d3dd2b7c1a4cbaffd123a0d2362294f3ef5b33c5486a"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.287807 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lmk59" event={"ID":"31254990-7f83-47db-93e7-267da242edd3","Type":"ContainerStarted","Data":"f735e1302d300f25b2b29f58fcd9f93317f657eb5fc144b7e77f192bcb49e100"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.289962 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" event={"ID":"0fbd1854-844d-4a1e-a44d-35daa5ee8a28","Type":"ContainerStarted","Data":"2a8bd4690db6695c7b58ebdf080efad8c135f6f44851c067edaae7ff01def997"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.292649 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" event={"ID":"ab4d6007-b58e-4566-9bc5-4d24b761a4ac","Type":"ContainerStarted","Data":"f0ca43b1f953fcfc2e0b0e728f707d1408a41912d00fbd248474e7c035733f84"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.295867 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" event={"ID":"2f07afbb-46fd-4c4a-b791-2798ecc11ca0","Type":"ContainerStarted","Data":"49cf4833edd885cb686cccbb9081803057f69f789710220c9d9f2dc351307019"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.297643 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" event={"ID":"b5f06bf6-da11-4997-b46b-c1abb7030253","Type":"ContainerStarted","Data":"d3baca19e3003430c1b9352adcfe1c93fd2fb465edb1c62dd58dffc4477b93f7"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.306678 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" event={"ID":"0aeae2a9-70a6-4813-878c-ea9555215b74","Type":"ContainerStarted","Data":"b3c3a35ebd5aea8cb59246cc764ebc77b0e604a49cca1ccbc8623a79d10db6ce"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.315772 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" event={"ID":"629410db-e237-4f48-8ad8-9a1b7d2edfec","Type":"ContainerStarted","Data":"bbaa0b92704338fd40b3ece9316419610c13f504c367a307b8afc14c2bb50af3"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.318329 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" 
event={"ID":"8c611070-b4f9-4b32-b436-d8c94d8b09df","Type":"ContainerStarted","Data":"9527030b87d18c0eb46f356a139dfca8247f910e521086332497a11fda4c2589"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.319289 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.319645 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.320960 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.820945226 +0000 UTC m=+155.642983863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.337895 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" event={"ID":"962646e1-6f06-40ed-a19a-d73f55b93d95","Type":"ContainerStarted","Data":"8d32821938673bd9c52468b1647ad1e4ac15873c86f3011d32b0353af3475384"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.339564 4825 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zt2mx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.339619 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" podUID="8c611070-b4f9-4b32-b436-d8c94d8b09df" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.371063 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" event={"ID":"dbf5fc3f-5709-48d6-a0d0-b8406a396b00","Type":"ContainerStarted","Data":"b5e250c720cddc67224021049c7abee23c0fa8d95ebb92ba19368a8185eda949"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.371437 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" event={"ID":"dbf5fc3f-5709-48d6-a0d0-b8406a396b00","Type":"ContainerStarted","Data":"b0bf66a794b5f89c6764b985e1322b56f7865894dc30cc4c4234440aef7ff065"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.400660 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" event={"ID":"3950d620-3e88-48fd-823a-f0ab8772ff5b","Type":"ContainerStarted","Data":"58db47d59ece8247419d04606990a836c33261d0b4d7baf611bfbaa951480e80"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.400854 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" event={"ID":"3950d620-3e88-48fd-823a-f0ab8772ff5b","Type":"ContainerStarted","Data":"06aee62d4b265805a18886720fae2dcfa719d50ec4ee1f62f80e889f8d6bd5fb"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.401928 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.420973 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.423204 4825 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9rmtb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 07 19:02:46 crc 
kubenswrapper[4825]: I1007 19:02:46.423277 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.425037 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:46.925022039 +0000 UTC m=+155.747060676 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.427172 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" event={"ID":"18b317fe-8d88-4915-bc98-89f42c8d4484","Type":"ContainerStarted","Data":"e6ed3ed70db44d9fe7e5b4dd236d70516ada9afdcc1c46be143baea7d045c848"} Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.458994 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.459192 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ll4q5" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 
19:02:46.476861 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.529110 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:46 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:46 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:46 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.529161 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.530027 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.531542 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.031526009 +0000 UTC m=+155.853564646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.633151 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.634005 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.133993961 +0000 UTC m=+155.956032598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.735675 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.736247 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.236219575 +0000 UTC m=+156.058258212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.778242 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bk8lv" podStartSLOduration=134.778212926 podStartE2EDuration="2m14.778212926s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:46.776709437 +0000 UTC m=+155.598748074" watchObservedRunningTime="2025-10-07 19:02:46.778212926 +0000 UTC m=+155.600251563" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.838026 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.838637 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.338624784 +0000 UTC m=+156.160663421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.884302 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrcjq" podStartSLOduration=134.884284672 podStartE2EDuration="2m14.884284672s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:46.850450452 +0000 UTC m=+155.672489089" watchObservedRunningTime="2025-10-07 19:02:46.884284672 +0000 UTC m=+155.706323309" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.884434 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" podStartSLOduration=134.884430397 podStartE2EDuration="2m14.884430397s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:46.881345488 +0000 UTC m=+155.703384125" watchObservedRunningTime="2025-10-07 19:02:46.884430397 +0000 UTC m=+155.706469034" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.939917 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" podStartSLOduration=134.939899498 podStartE2EDuration="2m14.939899498s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:46.937918155 +0000 UTC m=+155.759956792" watchObservedRunningTime="2025-10-07 19:02:46.939899498 +0000 UTC m=+155.761938135" Oct 07 19:02:46 crc kubenswrapper[4825]: I1007 19:02:46.945959 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:46 crc kubenswrapper[4825]: E1007 19:02:46.946992 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.446959113 +0000 UTC m=+156.268997750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.011254 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" podStartSLOduration=135.011217125 podStartE2EDuration="2m15.011217125s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.009466659 +0000 UTC m=+155.831505296" watchObservedRunningTime="2025-10-07 19:02:47.011217125 +0000 UTC m=+155.833255762" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.049477 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.050147 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.550134088 +0000 UTC m=+156.372172725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.051201 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mxsfh" podStartSLOduration=7.05115277 podStartE2EDuration="7.05115277s" podCreationTimestamp="2025-10-07 19:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.047748891 +0000 UTC m=+155.869787528" watchObservedRunningTime="2025-10-07 19:02:47.05115277 +0000 UTC m=+155.873204357" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.152073 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.152475 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.652459714 +0000 UTC m=+156.474498351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.242944 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhgpr" podStartSLOduration=135.242923483 podStartE2EDuration="2m15.242923483s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.213355148 +0000 UTC m=+156.035393785" watchObservedRunningTime="2025-10-07 19:02:47.242923483 +0000 UTC m=+156.064962120" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.254379 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.255170 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.755140062 +0000 UTC m=+156.577178689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.269431 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.285478 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lfhcp" podStartSLOduration=135.285455581 podStartE2EDuration="2m15.285455581s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.245907708 +0000 UTC m=+156.067946335" watchObservedRunningTime="2025-10-07 19:02:47.285455581 +0000 UTC m=+156.107494218" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.286452 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5db5x" podStartSLOduration=136.286446612 podStartE2EDuration="2m16.286446612s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.281071401 +0000 UTC m=+156.103110038" watchObservedRunningTime="2025-10-07 19:02:47.286446612 +0000 UTC m=+156.108485249" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.312985 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vng26" podStartSLOduration=135.312956409 podStartE2EDuration="2m15.312956409s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.309793528 +0000 UTC m=+156.131832165" watchObservedRunningTime="2025-10-07 19:02:47.312956409 +0000 UTC m=+156.134995046" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.357643 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.359398 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.859375771 +0000 UTC m=+156.681414408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.442678 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" event={"ID":"ab4d6007-b58e-4566-9bc5-4d24b761a4ac","Type":"ContainerStarted","Data":"e937b7cc30d7d1b552003e1244cae2fdff3d3d467e4de58855ba56afa2b95e9f"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.451116 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" event={"ID":"2f07afbb-46fd-4c4a-b791-2798ecc11ca0","Type":"ContainerStarted","Data":"803048d3b9d02a9beac02586d4ebf31fa7c565d8fd69f98cd30ee20fd483fe25"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.455194 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" event={"ID":"0aeae2a9-70a6-4813-878c-ea9555215b74","Type":"ContainerStarted","Data":"ef96a55b474f089b3fc5ad2279639e88a88d57a5f4b45195918d4cbed8035849"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.459059 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" event={"ID":"3cb31047-587d-4169-ae87-cd57a2186127","Type":"ContainerStarted","Data":"a0ce4546b88500d1c33ea38a672c2605e7fbb0ccaef22613fa6f378ae5d1226d"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.459122 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" event={"ID":"3cb31047-587d-4169-ae87-cd57a2186127","Type":"ContainerStarted","Data":"3868670f58ebcb3a8db4e6633ec745e5fa02944d53b2dd76ca8eef9d556b6693"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.460085 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.460388 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:47.960375936 +0000 UTC m=+156.782414573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.463372 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8xv6b" podStartSLOduration=135.463351961 podStartE2EDuration="2m15.463351961s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.459739075 +0000 UTC m=+156.281777712" watchObservedRunningTime="2025-10-07 19:02:47.463351961 +0000 UTC m=+156.285390598" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.466826 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" event={"ID":"9486410b-2d41-458b-b7d0-1bdd4aeedd09","Type":"ContainerStarted","Data":"7b99654ca503bf9d0b31b68506d7350804e4083e15629f883b54cd80cb67b678"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.467708 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.469532 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" event={"ID":"427677ce-ec80-45d0-adff-6ca227d075be","Type":"ContainerStarted","Data":"4e371cac8ff496a5bf685746aaeb9b0833ac1daa8bf7a5b9a6f67c005f1a0d0c"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.475465 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" event={"ID":"d9e8c599-7697-44cb-84f5-e21f5f88111e","Type":"ContainerStarted","Data":"5d2fc504623106c0adfe201b102c7e56e3556c3895ac04e48773e51bb75843ce"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.476471 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.477547 4825 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xmvk6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.477603 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" podUID="d9e8c599-7697-44cb-84f5-e21f5f88111e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.479176 4825 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fww5f container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.479204 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" podUID="9486410b-2d41-458b-b7d0-1bdd4aeedd09" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Oct 
07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.479266 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" event={"ID":"629410db-e237-4f48-8ad8-9a1b7d2edfec","Type":"ContainerStarted","Data":"b1a4e0d640091927804b5290ca537ad8efdbe59fe5dcfb3a79ba539513e94e60"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.483417 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" event={"ID":"1d77404b-ecd2-497c-9f7c-ec1ff470755e","Type":"ContainerStarted","Data":"245dfef2e6711e35935ca44fafe01fea0182f4a7124af8d74861eff32c024cf6"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.484548 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-254wk" podStartSLOduration=135.484535927 podStartE2EDuration="2m15.484535927s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.482860493 +0000 UTC m=+156.304899130" watchObservedRunningTime="2025-10-07 19:02:47.484535927 +0000 UTC m=+156.306574564" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.484869 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" event={"ID":"962646e1-6f06-40ed-a19a-d73f55b93d95","Type":"ContainerStarted","Data":"0b54aad862d5f393d8c29eddaf04aa548f7722770e18878e0876c61aee895e60"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.492531 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lmk59" event={"ID":"31254990-7f83-47db-93e7-267da242edd3","Type":"ContainerStarted","Data":"3efc28030f82544c2905cf8434ad0e3662d8a51324ce2284d0bdf0aa53f1d3a7"} Oct 07 19:02:47 crc 
kubenswrapper[4825]: I1007 19:02:47.492964 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.502801 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fbd1854-844d-4a1e-a44d-35daa5ee8a28" containerID="830c77c63ad7c8202d8f85d135f1486c21366b6fdf3d7a2c98bfd30349ff3a8e" exitCode=0 Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.503105 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" event={"ID":"0fbd1854-844d-4a1e-a44d-35daa5ee8a28","Type":"ContainerDied","Data":"830c77c63ad7c8202d8f85d135f1486c21366b6fdf3d7a2c98bfd30349ff3a8e"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.507633 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b8qt8" podStartSLOduration=135.507609223 podStartE2EDuration="2m15.507609223s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.504288287 +0000 UTC m=+156.326326924" watchObservedRunningTime="2025-10-07 19:02:47.507609223 +0000 UTC m=+156.329647860" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.527711 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" event={"ID":"a07b3c13-9b79-45d7-a759-e9c119bbe37b","Type":"ContainerStarted","Data":"d328574aa9908a3b56dd80ede02c1995e8d7a95376f1915a483a7ff919945572"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.540815 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g8q6q" podStartSLOduration=135.540797043 podStartE2EDuration="2m15.540797043s" 
podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.539602145 +0000 UTC m=+156.361640782" watchObservedRunningTime="2025-10-07 19:02:47.540797043 +0000 UTC m=+156.362835680" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.541793 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:47 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:47 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:47 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.541877 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.542150 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" event={"ID":"f53ff8ef-e493-4e2a-9895-e17bb40c8945","Type":"ContainerStarted","Data":"be164a0f3625ada2d64f4421f13bda0cd52b139475c82bcc5dd39837bef94d90"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.542182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" event={"ID":"f53ff8ef-e493-4e2a-9895-e17bb40c8945","Type":"ContainerStarted","Data":"e911752d98ab630f7c8098f456faec8cd2e10a210a28b1014c5c45445caac98c"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.542998 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.560755 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.561251 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" event={"ID":"b5f06bf6-da11-4997-b46b-c1abb7030253","Type":"ContainerStarted","Data":"537d7de7c2fcc512d7bb16c1bd68b63572450c1b66a2faae8533a10fa4ec93da"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.561282 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" event={"ID":"b5f06bf6-da11-4997-b46b-c1abb7030253","Type":"ContainerStarted","Data":"87661eed5ee3dd2d16c5c656fd013b133b662541569f7ebdcb9f71296b0165e5"} Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.561396 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.061380731 +0000 UTC m=+156.883419368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.569325 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mhcpj" podStartSLOduration=135.569309123 podStartE2EDuration="2m15.569309123s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.56792422 +0000 UTC m=+156.389962857" watchObservedRunningTime="2025-10-07 19:02:47.569309123 +0000 UTC m=+156.391347760" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.584784 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dmxjs" event={"ID":"18b317fe-8d88-4915-bc98-89f42c8d4484","Type":"ContainerStarted","Data":"364323c7243444f4fa18ac7a8cbf9ad0c4124570f6d6a01cca3fc21a82c926f9"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.599429 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" event={"ID":"b305d341-f68e-40db-b37c-11660cdac447","Type":"ContainerStarted","Data":"113f6d9686877bcacbab22951fcbeec4eb56b70baa2a93dc1c54271a1ca7e34f"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.622948 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" 
event={"ID":"ac79181c-997a-4974-9fdd-aea6b3f2903c","Type":"ContainerStarted","Data":"2221d54e21dcc7a9ed164550f87ac38deaff6ac9415ecbe98b6590cfdefc0d66"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.653761 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qwwdm" podStartSLOduration=135.653736719 podStartE2EDuration="2m15.653736719s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.596922405 +0000 UTC m=+156.418961052" watchObservedRunningTime="2025-10-07 19:02:47.653736719 +0000 UTC m=+156.475775356" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.654177 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" podStartSLOduration=135.654172253 podStartE2EDuration="2m15.654172253s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.649017439 +0000 UTC m=+156.471056076" watchObservedRunningTime="2025-10-07 19:02:47.654172253 +0000 UTC m=+156.476210890" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.654793 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" event={"ID":"37b43fb6-6cdc-40fd-b4db-d044e8e2d630","Type":"ContainerStarted","Data":"b5f6760b29cf6be61458ba05022b6d6a4731daaae59df81ec7dc17a3b7e40281"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.654830 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" 
event={"ID":"37b43fb6-6cdc-40fd-b4db-d044e8e2d630","Type":"ContainerStarted","Data":"c88c0bca13cc37ebae56915c85e6e4123a70e1565f277a566a5a23de1c6ecf4e"} Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.655678 4825 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9rmtb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.655715 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.661443 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zt2mx" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.673941 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.676072 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.176054022 +0000 UTC m=+156.998092659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.768387 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.768950 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.773392 4825 patch_prober.go:28] interesting pod/apiserver-76f77b778f-rp8vt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.773558 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" podUID="1d77404b-ecd2-497c-9f7c-ec1ff470755e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.779945 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 
19:02:47.780155 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" podStartSLOduration=135.780135165 podStartE2EDuration="2m15.780135165s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.77936394 +0000 UTC m=+156.601402587" watchObservedRunningTime="2025-10-07 19:02:47.780135165 +0000 UTC m=+156.602173802" Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.782539 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.282516431 +0000 UTC m=+157.104555068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.785895 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lmk59" podStartSLOduration=7.785879028 podStartE2EDuration="7.785879028s" podCreationTimestamp="2025-10-07 19:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.729534109 +0000 UTC m=+156.551572746" watchObservedRunningTime="2025-10-07 19:02:47.785879028 +0000 UTC m=+156.607917665" Oct 07 19:02:47 crc 
kubenswrapper[4825]: I1007 19:02:47.825371 4825 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-dh96g container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.825437 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" podUID="a07b3c13-9b79-45d7-a759-e9c119bbe37b" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.840258 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.840290 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.884108 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.884441 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.384429725 +0000 UTC m=+157.206468362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.896161 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" podStartSLOduration=135.896142058 podStartE2EDuration="2m15.896142058s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.864692095 +0000 UTC m=+156.686730732" watchObservedRunningTime="2025-10-07 19:02:47.896142058 +0000 UTC m=+156.718180695" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.940300 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" podStartSLOduration=135.940284148 podStartE2EDuration="2m15.940284148s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.933022156 +0000 UTC m=+156.755060793" watchObservedRunningTime="2025-10-07 19:02:47.940284148 +0000 UTC m=+156.762322785" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.961749 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" podStartSLOduration=135.961719452 podStartE2EDuration="2m15.961719452s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.959896435 +0000 UTC m=+156.781935072" watchObservedRunningTime="2025-10-07 19:02:47.961719452 +0000 UTC m=+156.783758089" Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.985661 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.985798 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.48577923 +0000 UTC m=+157.307817867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.985927 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:47 crc kubenswrapper[4825]: E1007 19:02:47.986249 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.486240915 +0000 UTC m=+157.308279542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:47 crc kubenswrapper[4825]: I1007 19:02:47.999202 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pmswz" podStartSLOduration=135.999181799 podStartE2EDuration="2m15.999181799s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:47.9948426 +0000 UTC m=+156.816881237" watchObservedRunningTime="2025-10-07 19:02:47.999181799 +0000 UTC m=+156.821220436" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.081018 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qwqs9" podStartSLOduration=136.081000641 podStartE2EDuration="2m16.081000641s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:48.074367599 +0000 UTC m=+156.896406246" watchObservedRunningTime="2025-10-07 19:02:48.081000641 +0000 UTC m=+156.903039278" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.087504 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.087725 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.587696385 +0000 UTC m=+157.409735022 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.087783 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.088074 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.588063977 +0000 UTC m=+157.410102614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.181142 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" podStartSLOduration=137.181112338 podStartE2EDuration="2m17.181112338s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:48.162065349 +0000 UTC m=+156.984103986" watchObservedRunningTime="2025-10-07 19:02:48.181112338 +0000 UTC m=+157.003150975" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.182907 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dfj7" podStartSLOduration=136.182902204 podStartE2EDuration="2m16.182902204s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:48.181571111 +0000 UTC m=+157.003609748" watchObservedRunningTime="2025-10-07 19:02:48.182902204 +0000 UTC m=+157.004940841" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.188996 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.189201 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.689167415 +0000 UTC m=+157.511206052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.189370 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.189743 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.689735653 +0000 UTC m=+157.511774290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.290416 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.290614 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.790583443 +0000 UTC m=+157.612622080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.290773 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.291252 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.791218432 +0000 UTC m=+157.613257069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.391967 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.392280 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.892244888 +0000 UTC m=+157.714283525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.392399 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.392840 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.892821067 +0000 UTC m=+157.714859714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.493964 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.494132 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.99410064 +0000 UTC m=+157.816139277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.494209 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.494538 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:48.994530934 +0000 UTC m=+157.816569571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.527359 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:48 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:48 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:48 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.527464 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.595478 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.595652 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:49.095625472 +0000 UTC m=+157.917664109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.595803 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.596145 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.096132678 +0000 UTC m=+157.918171315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.678704 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" event={"ID":"0fbd1854-844d-4a1e-a44d-35daa5ee8a28","Type":"ContainerStarted","Data":"c3cbcdb1d80f575156e900201b4598a90b174ec728f632cddb2c9158e61050eb"} Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.679251 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.696521 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.696746 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.196713869 +0000 UTC m=+158.018752506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.696850 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.697120 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.197109532 +0000 UTC m=+158.019148169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.697407 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" event={"ID":"427677ce-ec80-45d0-adff-6ca227d075be","Type":"ContainerStarted","Data":"93eeb79f194a0e96d2274528deb360196f1b8055338f5b37ecca061df4912a20"} Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.700529 4825 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9rmtb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.700571 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.711463 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" podStartSLOduration=137.711413838 podStartE2EDuration="2m17.711413838s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 19:02:48.710426417 +0000 UTC m=+157.532465054" watchObservedRunningTime="2025-10-07 19:02:48.711413838 +0000 UTC m=+157.533452475" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.713667 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmvk6" Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.798742 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.800640 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.300625087 +0000 UTC m=+158.122663724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:48 crc kubenswrapper[4825]: I1007 19:02:48.904135 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:48 crc kubenswrapper[4825]: E1007 19:02:48.905784 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.405772504 +0000 UTC m=+158.227811141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.006754 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.007162 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.507146611 +0000 UTC m=+158.329185248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.108759 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.109048 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.609035914 +0000 UTC m=+158.431074541 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.209780 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.209987 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.709956716 +0000 UTC m=+158.531995353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.210080 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.210435 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.710421431 +0000 UTC m=+158.532460068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.261594 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fww5f" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.311704 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.311868 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.811845699 +0000 UTC m=+158.633884336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.311955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.312416 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.812403207 +0000 UTC m=+158.634441844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.376887 4825 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.416772 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.417127 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 19:02:49.91711261 +0000 UTC m=+158.739151247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.518506 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.519085 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:50.019068816 +0000 UTC m=+158.841107453 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.524600 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:49 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:49 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:49 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.524659 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.619756 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.620081 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:50.12006683 +0000 UTC m=+158.942105467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.705359 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" event={"ID":"427677ce-ec80-45d0-adff-6ca227d075be","Type":"ContainerStarted","Data":"a3f016e868c80d38801b494f3aedca1c716620d96beda00300332f33b98f56dd"} Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.705416 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" event={"ID":"427677ce-ec80-45d0-adff-6ca227d075be","Type":"ContainerStarted","Data":"c0693e08b734c943949f6527f24b8faa1e280355f9621b3fec6ba8c68cd713ca"} Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.721288 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.721736 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:50.221718425 +0000 UTC m=+159.043757062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.789580 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nww4f"] Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.790518 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.792264 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.804528 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nww4f"] Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.822307 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.823157 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 19:02:50.323128713 +0000 UTC m=+159.145167350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.924369 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-utilities\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.924416 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-catalog-content\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.924437 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwttq\" (UniqueName: \"kubernetes.io/projected/37844b25-13d2-4bd7-8807-35c4bc1a4dde-kube-api-access-kwttq\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.924480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:49 crc kubenswrapper[4825]: E1007 19:02:49.924752 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 19:02:50.424740988 +0000 UTC m=+159.246779625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r2xb4" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.974859 4825 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T19:02:49.376919546Z","Handler":null,"Name":""} Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.977322 4825 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.977351 4825 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.986275 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-5shkc"] Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.987151 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.989000 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 19:02:49 crc kubenswrapper[4825]: I1007 19:02:49.998288 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5shkc"] Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.025637 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.025801 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-utilities\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.025836 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-catalog-content\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.025855 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwttq\" (UniqueName: 
\"kubernetes.io/projected/37844b25-13d2-4bd7-8807-35c4bc1a4dde-kube-api-access-kwttq\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.026664 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-utilities\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.026880 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-catalog-content\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.030957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.049290 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwttq\" (UniqueName: \"kubernetes.io/projected/37844b25-13d2-4bd7-8807-35c4bc1a4dde-kube-api-access-kwttq\") pod \"community-operators-nww4f\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.104728 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.127002 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-utilities\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.127058 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxmn\" (UniqueName: \"kubernetes.io/projected/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-kube-api-access-5vxmn\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.127127 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.127187 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-catalog-content\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.130388 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.130429 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.169345 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r2xb4\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.188145 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pkxnc"] Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.189282 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.207599 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pkxnc"] Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.228573 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-catalog-content\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.228694 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-utilities\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.228727 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxmn\" (UniqueName: \"kubernetes.io/projected/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-kube-api-access-5vxmn\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.229667 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-catalog-content\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.229803 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-utilities\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.251535 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxmn\" (UniqueName: \"kubernetes.io/projected/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-kube-api-access-5vxmn\") pod \"certified-operators-5shkc\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.299915 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.330279 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9tj\" (UniqueName: \"kubernetes.io/projected/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-kube-api-access-qb9tj\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.330364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-catalog-content\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.330499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-utilities\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") 
" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.342891 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nww4f"] Oct 07 19:02:50 crc kubenswrapper[4825]: W1007 19:02:50.355733 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37844b25_13d2_4bd7_8807_35c4bc1a4dde.slice/crio-9956d237bd1f499c5d783b37416ae05ddb4ace77fb6bcceda4cb7d28ef21b9a5 WatchSource:0}: Error finding container 9956d237bd1f499c5d783b37416ae05ddb4ace77fb6bcceda4cb7d28ef21b9a5: Status 404 returned error can't find the container with id 9956d237bd1f499c5d783b37416ae05ddb4ace77fb6bcceda4cb7d28ef21b9a5 Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.362602 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.390882 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2k9wd"] Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.391899 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.399192 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2k9wd"] Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.433948 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-utilities\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.434067 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9tj\" (UniqueName: \"kubernetes.io/projected/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-kube-api-access-qb9tj\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.434149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-catalog-content\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.434915 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-catalog-content\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.435205 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-utilities\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.459563 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9tj\" (UniqueName: \"kubernetes.io/projected/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-kube-api-access-qb9tj\") pod \"community-operators-pkxnc\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.510382 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.526798 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:50 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:50 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:50 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.526851 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.536657 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-utilities\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " 
pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.536698 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-catalog-content\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.536767 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p58bg\" (UniqueName: \"kubernetes.io/projected/5ca5c5b3-f474-4650-9e03-c20849b4f03b-kube-api-access-p58bg\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.551446 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5shkc"] Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.619825 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r2xb4"] Oct 07 19:02:50 crc kubenswrapper[4825]: W1007 19:02:50.625895 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4f51b57_041d_4009_9db3_3579fa7bb84c.slice/crio-e524be5ae2150708921b8671de2f5b9d4fc76f23ef54369db6830d7ef2628178 WatchSource:0}: Error finding container e524be5ae2150708921b8671de2f5b9d4fc76f23ef54369db6830d7ef2628178: Status 404 returned error can't find the container with id e524be5ae2150708921b8671de2f5b9d4fc76f23ef54369db6830d7ef2628178 Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.638107 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-utilities\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.638141 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-catalog-content\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.638182 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p58bg\" (UniqueName: \"kubernetes.io/projected/5ca5c5b3-f474-4650-9e03-c20849b4f03b-kube-api-access-p58bg\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.638817 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-utilities\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.639012 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-catalog-content\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.654339 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p58bg\" (UniqueName: 
\"kubernetes.io/projected/5ca5c5b3-f474-4650-9e03-c20849b4f03b-kube-api-access-p58bg\") pod \"certified-operators-2k9wd\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.695152 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pkxnc"] Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.707704 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.712611 4825 generic.go:334] "Generic (PLEG): container finished" podID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerID="00bc5e686ee488f1c5f671f83bb1a68a97ce2799650ed4baf9ea3a79bccde179" exitCode=0 Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.712677 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nww4f" event={"ID":"37844b25-13d2-4bd7-8807-35c4bc1a4dde","Type":"ContainerDied","Data":"00bc5e686ee488f1c5f671f83bb1a68a97ce2799650ed4baf9ea3a79bccde179"} Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.712700 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nww4f" event={"ID":"37844b25-13d2-4bd7-8807-35c4bc1a4dde","Type":"ContainerStarted","Data":"9956d237bd1f499c5d783b37416ae05ddb4ace77fb6bcceda4cb7d28ef21b9a5"} Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.714311 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" event={"ID":"b305d341-f68e-40db-b37c-11660cdac447","Type":"ContainerDied","Data":"113f6d9686877bcacbab22951fcbeec4eb56b70baa2a93dc1c54271a1ca7e34f"} Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.714220 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="b305d341-f68e-40db-b37c-11660cdac447" containerID="113f6d9686877bcacbab22951fcbeec4eb56b70baa2a93dc1c54271a1ca7e34f" exitCode=0 Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.715180 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.715633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" event={"ID":"a4f51b57-041d-4009-9db3-3579fa7bb84c","Type":"ContainerStarted","Data":"e524be5ae2150708921b8671de2f5b9d4fc76f23ef54369db6830d7ef2628178"} Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.717066 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5shkc" event={"ID":"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c","Type":"ContainerStarted","Data":"a019c6e70fccd0176f9118bb54f5fd852eb2c2a03bdaa7939f40f44d9e3bbbec"} Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.718310 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkxnc" event={"ID":"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb","Type":"ContainerStarted","Data":"7069b039c5aeabc4ab412b87d949f092042c175c3984b981092a7372de3b3a26"} Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.741670 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7rf45" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.820694 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mnjlm" podStartSLOduration=10.820677273 podStartE2EDuration="10.820677273s" podCreationTimestamp="2025-10-07 19:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:50.818281187 +0000 UTC m=+159.640319824" 
watchObservedRunningTime="2025-10-07 19:02:50.820677273 +0000 UTC m=+159.642715910" Oct 07 19:02:50 crc kubenswrapper[4825]: I1007 19:02:50.976476 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2k9wd"] Oct 07 19:02:50 crc kubenswrapper[4825]: W1007 19:02:50.983837 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca5c5b3_f474_4650_9e03_c20849b4f03b.slice/crio-3e00b503c003868de90a55bfb99af25f9c481f929ecc9d0cba910de21b1cd8d4 WatchSource:0}: Error finding container 3e00b503c003868de90a55bfb99af25f9c481f929ecc9d0cba910de21b1cd8d4: Status 404 returned error can't find the container with id 3e00b503c003868de90a55bfb99af25f9c481f929ecc9d0cba910de21b1cd8d4 Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.525455 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:51 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:51 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:51 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.525547 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.726468 4825 generic.go:334] "Generic (PLEG): container finished" podID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerID="62fd5e093fc1ce72e2e7adfc6ae46f46e0561f73dce2281386fa61dfcc29eb49" exitCode=0 Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.726547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-5shkc" event={"ID":"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c","Type":"ContainerDied","Data":"62fd5e093fc1ce72e2e7adfc6ae46f46e0561f73dce2281386fa61dfcc29eb49"} Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.730728 4825 generic.go:334] "Generic (PLEG): container finished" podID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerID="bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a" exitCode=0 Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.731209 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkxnc" event={"ID":"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb","Type":"ContainerDied","Data":"bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a"} Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.735802 4825 generic.go:334] "Generic (PLEG): container finished" podID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerID="a746270b14f3a6a793b32395cee50990687647bee292663e37f5274f012ab882" exitCode=0 Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.735877 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k9wd" event={"ID":"5ca5c5b3-f474-4650-9e03-c20849b4f03b","Type":"ContainerDied","Data":"a746270b14f3a6a793b32395cee50990687647bee292663e37f5274f012ab882"} Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.735904 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k9wd" event={"ID":"5ca5c5b3-f474-4650-9e03-c20849b4f03b","Type":"ContainerStarted","Data":"3e00b503c003868de90a55bfb99af25f9c481f929ecc9d0cba910de21b1cd8d4"} Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.737607 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" 
event={"ID":"a4f51b57-041d-4009-9db3-3579fa7bb84c","Type":"ContainerStarted","Data":"01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2"} Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.740335 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.767868 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" podStartSLOduration=139.767850684 podStartE2EDuration="2m19.767850684s" podCreationTimestamp="2025-10-07 19:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:51.765922653 +0000 UTC m=+160.587961290" watchObservedRunningTime="2025-10-07 19:02:51.767850684 +0000 UTC m=+160.589889321" Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.807011 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.988823 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5gwml"] Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.990055 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:51 crc kubenswrapper[4825]: I1007 19:02:51.992300 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.003186 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gwml"] Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.005965 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.174553 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b305d341-f68e-40db-b37c-11660cdac447-config-volume\") pod \"b305d341-f68e-40db-b37c-11660cdac447\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.174718 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b305d341-f68e-40db-b37c-11660cdac447-secret-volume\") pod \"b305d341-f68e-40db-b37c-11660cdac447\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.174901 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj952\" (UniqueName: \"kubernetes.io/projected/b305d341-f68e-40db-b37c-11660cdac447-kube-api-access-jj952\") pod \"b305d341-f68e-40db-b37c-11660cdac447\" (UID: \"b305d341-f68e-40db-b37c-11660cdac447\") " Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.175161 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-utilities\") pod 
\"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.175364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vsrp\" (UniqueName: \"kubernetes.io/projected/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-kube-api-access-7vsrp\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.175441 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-catalog-content\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.175689 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b305d341-f68e-40db-b37c-11660cdac447-config-volume" (OuterVolumeSpecName: "config-volume") pod "b305d341-f68e-40db-b37c-11660cdac447" (UID: "b305d341-f68e-40db-b37c-11660cdac447"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.188658 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b305d341-f68e-40db-b37c-11660cdac447-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b305d341-f68e-40db-b37c-11660cdac447" (UID: "b305d341-f68e-40db-b37c-11660cdac447"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.188890 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b305d341-f68e-40db-b37c-11660cdac447-kube-api-access-jj952" (OuterVolumeSpecName: "kube-api-access-jj952") pod "b305d341-f68e-40db-b37c-11660cdac447" (UID: "b305d341-f68e-40db-b37c-11660cdac447"). InnerVolumeSpecName "kube-api-access-jj952". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.277628 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vsrp\" (UniqueName: \"kubernetes.io/projected/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-kube-api-access-7vsrp\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.277705 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-catalog-content\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.277756 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-utilities\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.277812 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj952\" (UniqueName: \"kubernetes.io/projected/b305d341-f68e-40db-b37c-11660cdac447-kube-api-access-jj952\") on node \"crc\" DevicePath \"\"" Oct 07 19:02:52 
crc kubenswrapper[4825]: I1007 19:02:52.277827 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b305d341-f68e-40db-b37c-11660cdac447-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.277840 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b305d341-f68e-40db-b37c-11660cdac447-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.278279 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-utilities\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.278321 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-catalog-content\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.298829 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vsrp\" (UniqueName: \"kubernetes.io/projected/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-kube-api-access-7vsrp\") pod \"redhat-marketplace-5gwml\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.324466 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.390319 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rdk5w"] Oct 07 19:02:52 crc kubenswrapper[4825]: E1007 19:02:52.390561 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b305d341-f68e-40db-b37c-11660cdac447" containerName="collect-profiles" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.390575 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b305d341-f68e-40db-b37c-11660cdac447" containerName="collect-profiles" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.390719 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b305d341-f68e-40db-b37c-11660cdac447" containerName="collect-profiles" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.391403 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.405362 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdk5w"] Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.454495 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.458145 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.459868 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.460048 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.475900 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.481387 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-utilities\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.481508 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-catalog-content\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.481531 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsn8x\" (UniqueName: \"kubernetes.io/projected/7ea1416f-9faf-428d-bf84-9308f267669e-kube-api-access-nsn8x\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.525679 4825 patch_prober.go:28] interesting 
pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:52 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:52 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:52 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.525738 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.583111 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-catalog-content\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.583170 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsn8x\" (UniqueName: \"kubernetes.io/projected/7ea1416f-9faf-428d-bf84-9308f267669e-kube-api-access-nsn8x\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.583218 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaa5476c-840a-48a8-b6d5-6ced96751e74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 
19:02:52.583284 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-utilities\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.583323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaa5476c-840a-48a8-b6d5-6ced96751e74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.584033 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-catalog-content\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.584285 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-utilities\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.598574 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-hpckv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.598614 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-hpckv 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.598635 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hpckv" podUID="f9005f09-0f66-4541-8cb0-725ba2f4380d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.598669 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hpckv" podUID="f9005f09-0f66-4541-8cb0-725ba2f4380d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.603297 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsn8x\" (UniqueName: \"kubernetes.io/projected/7ea1416f-9faf-428d-bf84-9308f267669e-kube-api-access-nsn8x\") pod \"redhat-marketplace-rdk5w\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.630754 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gwml"] Oct 07 19:02:52 crc kubenswrapper[4825]: W1007 19:02:52.657210 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9407d2_7436_4a1e_82ef_babe5b4db5e9.slice/crio-6592aaa1b84a14913888ae75e25064de5b9fc43701ea1c96f99b8008f87dc92a WatchSource:0}: Error finding container 6592aaa1b84a14913888ae75e25064de5b9fc43701ea1c96f99b8008f87dc92a: Status 404 returned error can't find the container with id 
6592aaa1b84a14913888ae75e25064de5b9fc43701ea1c96f99b8008f87dc92a Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.685012 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaa5476c-840a-48a8-b6d5-6ced96751e74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.685103 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaa5476c-840a-48a8-b6d5-6ced96751e74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.685150 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaa5476c-840a-48a8-b6d5-6ced96751e74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.700655 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaa5476c-840a-48a8-b6d5-6ced96751e74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.720001 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.747331 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gwml" event={"ID":"fa9407d2-7436-4a1e-82ef-babe5b4db5e9","Type":"ContainerStarted","Data":"6592aaa1b84a14913888ae75e25064de5b9fc43701ea1c96f99b8008f87dc92a"} Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.752922 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" event={"ID":"b305d341-f68e-40db-b37c-11660cdac447","Type":"ContainerDied","Data":"2a23e2ebbe7ba015cf7e5c0355b42f349bc8f63f31eeb99a0ca383652cc7df2c"} Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.752968 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.752976 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a23e2ebbe7ba015cf7e5c0355b42f349bc8f63f31eeb99a0ca383652cc7df2c" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.775335 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.780698 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rp8vt" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.784825 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.807767 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.816801 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dh96g" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.862415 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.863601 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.901188 4825 patch_prober.go:28] interesting pod/console-f9d7485db-sqfnk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.901284 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sqfnk" podUID="21bd5368-2631-4c6c-94cf-d6e64b1dd657" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.994034 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x8pgw"] Oct 07 19:02:52 crc kubenswrapper[4825]: I1007 19:02:52.995444 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.001576 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.003672 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x8pgw"] Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.109582 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-utilities\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.109656 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x4gm\" (UniqueName: \"kubernetes.io/projected/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-kube-api-access-2x4gm\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.109703 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-catalog-content\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.121269 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdk5w"] Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.181694 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 19:02:53 crc kubenswrapper[4825]: W1007 19:02:53.194088 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeaa5476c_840a_48a8_b6d5_6ced96751e74.slice/crio-7f5226326a77af5182451d4d111c37c25b2d0e3717580da33b5eeae0a3849243 WatchSource:0}: Error finding container 7f5226326a77af5182451d4d111c37c25b2d0e3717580da33b5eeae0a3849243: Status 404 returned error can't find the container with id 7f5226326a77af5182451d4d111c37c25b2d0e3717580da33b5eeae0a3849243 Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.210902 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-utilities\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.210951 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x4gm\" (UniqueName: \"kubernetes.io/projected/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-kube-api-access-2x4gm\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.210983 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-catalog-content\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.211419 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-catalog-content\") pod 
\"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.211635 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-utilities\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.231803 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4gm\" (UniqueName: \"kubernetes.io/projected/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-kube-api-access-2x4gm\") pod \"redhat-operators-x8pgw\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.328221 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.391212 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zgk7n"] Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.393375 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.400976 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgk7n"] Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.425374 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-utilities\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.425422 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-catalog-content\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.425451 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qx4l\" (UniqueName: \"kubernetes.io/projected/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-kube-api-access-7qx4l\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.471805 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.474982 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.485727 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.486090 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.490002 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.522312 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.526431 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-utilities\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.526498 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-catalog-content\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.526532 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qx4l\" (UniqueName: \"kubernetes.io/projected/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-kube-api-access-7qx4l\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc 
kubenswrapper[4825]: I1007 19:02:53.528150 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-catalog-content\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.528151 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-utilities\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.531838 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:53 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:53 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:53 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.532061 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.560489 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qx4l\" (UniqueName: \"kubernetes.io/projected/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-kube-api-access-7qx4l\") pod \"redhat-operators-zgk7n\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: 
I1007 19:02:53.585600 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.627712 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.627822 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.728984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.729103 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.729198 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.730348 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.755395 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.780896 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x8pgw"] Oct 07 19:02:53 crc kubenswrapper[4825]: W1007 19:02:53.801155 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd980b20c_41bd_4aff_9a22_2e806ce8d5cf.slice/crio-b6dee80d25b6cbaf4b6435f1618b1a1c1820f72ae85cd3caab89a4c3aa8f59f9 WatchSource:0}: Error finding container b6dee80d25b6cbaf4b6435f1618b1a1c1820f72ae85cd3caab89a4c3aa8f59f9: Status 404 returned error can't find the container with id b6dee80d25b6cbaf4b6435f1618b1a1c1820f72ae85cd3caab89a4c3aa8f59f9 Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.804839 4825 generic.go:334] "Generic (PLEG): container finished" podID="7ea1416f-9faf-428d-bf84-9308f267669e" containerID="c53d18abb17e375e74d0aa47a8f09968caac70ae15b785a5480243beacb9c83b" exitCode=0 Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.806668 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdk5w" event={"ID":"7ea1416f-9faf-428d-bf84-9308f267669e","Type":"ContainerDied","Data":"c53d18abb17e375e74d0aa47a8f09968caac70ae15b785a5480243beacb9c83b"} Oct 07 
19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.806720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdk5w" event={"ID":"7ea1416f-9faf-428d-bf84-9308f267669e","Type":"ContainerStarted","Data":"7b4965579a169c45fee7da22bf38a7ba7b69be65633e0e92f49fbb0bd917909c"} Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.811919 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"eaa5476c-840a-48a8-b6d5-6ced96751e74","Type":"ContainerStarted","Data":"cc85df94e13c3e7e073b698491c0e7d910bd8d2331f139a057ca999607f76bc5"} Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.812105 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"eaa5476c-840a-48a8-b6d5-6ced96751e74","Type":"ContainerStarted","Data":"7f5226326a77af5182451d4d111c37c25b2d0e3717580da33b5eeae0a3849243"} Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.815059 4825 generic.go:334] "Generic (PLEG): container finished" podID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerID="105f5dd179dee36d5c929d2e113fc45bde45d16b7c9d39f94665e86530e7e71a" exitCode=0 Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.815273 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gwml" event={"ID":"fa9407d2-7436-4a1e-82ef-babe5b4db5e9","Type":"ContainerDied","Data":"105f5dd179dee36d5c929d2e113fc45bde45d16b7c9d39f94665e86530e7e71a"} Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.816868 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:02:53 crc kubenswrapper[4825]: I1007 19:02:53.861951 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.8619334140000001 podStartE2EDuration="1.861933414s" podCreationTimestamp="2025-10-07 19:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:53.856805851 +0000 UTC m=+162.678844488" watchObservedRunningTime="2025-10-07 19:02:53.861933414 +0000 UTC m=+162.683972051" Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.043278 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgk7n"] Oct 07 19:02:54 crc kubenswrapper[4825]: W1007 19:02:54.144384 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7c36e3_249c_4ad2_ab08_5b00b3f71218.slice/crio-77c2310de5bdc804406507410a9e0cf608fdc84585557c488917f899b7c50510 WatchSource:0}: Error finding container 77c2310de5bdc804406507410a9e0cf608fdc84585557c488917f899b7c50510: Status 404 returned error can't find the container with id 77c2310de5bdc804406507410a9e0cf608fdc84585557c488917f899b7c50510 Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.201689 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.240881 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 
19:02:54.248669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee9b984f-baa3-429f-b929-3d61d5e204bc-metrics-certs\") pod \"network-metrics-daemon-bvwh2\" (UID: \"ee9b984f-baa3-429f-b929-3d61d5e204bc\") " pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.416838 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvwh2" Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.525392 4825 patch_prober.go:28] interesting pod/router-default-5444994796-jfp2b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 19:02:54 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Oct 07 19:02:54 crc kubenswrapper[4825]: [+]process-running ok Oct 07 19:02:54 crc kubenswrapper[4825]: healthz check failed Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.525471 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jfp2b" podUID="f870c556-621f-4517-b1df-4e528a96f44f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.836079 4825 generic.go:334] "Generic (PLEG): container finished" podID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerID="b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4" exitCode=0 Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.836383 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgk7n" event={"ID":"2a7c36e3-249c-4ad2-ab08-5b00b3f71218","Type":"ContainerDied","Data":"b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4"} Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.836564 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgk7n" event={"ID":"2a7c36e3-249c-4ad2-ab08-5b00b3f71218","Type":"ContainerStarted","Data":"77c2310de5bdc804406507410a9e0cf608fdc84585557c488917f899b7c50510"} Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.840696 4825 generic.go:334] "Generic (PLEG): container finished" podID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerID="3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97" exitCode=0 Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.841146 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8pgw" event={"ID":"d980b20c-41bd-4aff-9a22-2e806ce8d5cf","Type":"ContainerDied","Data":"3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97"} Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.841188 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8pgw" event={"ID":"d980b20c-41bd-4aff-9a22-2e806ce8d5cf","Type":"ContainerStarted","Data":"b6dee80d25b6cbaf4b6435f1618b1a1c1820f72ae85cd3caab89a4c3aa8f59f9"} Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.847597 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f","Type":"ContainerStarted","Data":"489b435783243f28dac755206359fdcef714861544aa64a7606d553053999f7b"} Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.852333 4825 generic.go:334] "Generic (PLEG): container finished" podID="eaa5476c-840a-48a8-b6d5-6ced96751e74" containerID="cc85df94e13c3e7e073b698491c0e7d910bd8d2331f139a057ca999607f76bc5" exitCode=0 Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.852361 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"eaa5476c-840a-48a8-b6d5-6ced96751e74","Type":"ContainerDied","Data":"cc85df94e13c3e7e073b698491c0e7d910bd8d2331f139a057ca999607f76bc5"} Oct 07 19:02:54 crc kubenswrapper[4825]: I1007 19:02:54.990409 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bvwh2"] Oct 07 19:02:54 crc kubenswrapper[4825]: W1007 19:02:54.994519 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee9b984f_baa3_429f_b929_3d61d5e204bc.slice/crio-5b298bf099de6f6907cdbeb9ccc923426012b36b13f91fd28bb017c1cc0aff5a WatchSource:0}: Error finding container 5b298bf099de6f6907cdbeb9ccc923426012b36b13f91fd28bb017c1cc0aff5a: Status 404 returned error can't find the container with id 5b298bf099de6f6907cdbeb9ccc923426012b36b13f91fd28bb017c1cc0aff5a Oct 07 19:02:55 crc kubenswrapper[4825]: I1007 19:02:55.558047 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:55 crc kubenswrapper[4825]: I1007 19:02:55.561290 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jfp2b" Oct 07 19:02:55 crc kubenswrapper[4825]: I1007 19:02:55.867658 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f","Type":"ContainerStarted","Data":"f00c5eddc2edda4cbc16918261035b41fe58ff0175ce5b48c558d1e97d3f717a"} Oct 07 19:02:55 crc kubenswrapper[4825]: I1007 19:02:55.875119 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" event={"ID":"ee9b984f-baa3-429f-b929-3d61d5e204bc","Type":"ContainerStarted","Data":"5b298bf099de6f6907cdbeb9ccc923426012b36b13f91fd28bb017c1cc0aff5a"} Oct 07 19:02:55 crc kubenswrapper[4825]: I1007 19:02:55.893337 4825 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.8933184929999998 podStartE2EDuration="2.893318493s" podCreationTimestamp="2025-10-07 19:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:55.891171535 +0000 UTC m=+164.713210172" watchObservedRunningTime="2025-10-07 19:02:55.893318493 +0000 UTC m=+164.715357130" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.133536 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.286133 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaa5476c-840a-48a8-b6d5-6ced96751e74-kube-api-access\") pod \"eaa5476c-840a-48a8-b6d5-6ced96751e74\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.286277 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaa5476c-840a-48a8-b6d5-6ced96751e74-kubelet-dir\") pod \"eaa5476c-840a-48a8-b6d5-6ced96751e74\" (UID: \"eaa5476c-840a-48a8-b6d5-6ced96751e74\") " Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.286449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaa5476c-840a-48a8-b6d5-6ced96751e74-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eaa5476c-840a-48a8-b6d5-6ced96751e74" (UID: "eaa5476c-840a-48a8-b6d5-6ced96751e74"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.292899 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa5476c-840a-48a8-b6d5-6ced96751e74-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eaa5476c-840a-48a8-b6d5-6ced96751e74" (UID: "eaa5476c-840a-48a8-b6d5-6ced96751e74"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.387664 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaa5476c-840a-48a8-b6d5-6ced96751e74-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.387695 4825 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaa5476c-840a-48a8-b6d5-6ced96751e74-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.891486 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" event={"ID":"ee9b984f-baa3-429f-b929-3d61d5e204bc","Type":"ContainerStarted","Data":"d3c5ec981107036ebbee22ab6347840f0be3b4f79f56f9420ae51965ff822173"} Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.891582 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bvwh2" event={"ID":"ee9b984f-baa3-429f-b929-3d61d5e204bc","Type":"ContainerStarted","Data":"5a08a118fd8f8ef670a9fca187dd7451934b0b27f12721d913ce5f1e41633f07"} Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.900983 4825 generic.go:334] "Generic (PLEG): container finished" podID="4ea549f0-6076-4cd3-a3d5-fa5274a8b37f" containerID="f00c5eddc2edda4cbc16918261035b41fe58ff0175ce5b48c558d1e97d3f717a" exitCode=0 Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.901090 
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f","Type":"ContainerDied","Data":"f00c5eddc2edda4cbc16918261035b41fe58ff0175ce5b48c558d1e97d3f717a"} Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.909091 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bvwh2" podStartSLOduration=145.909074054 podStartE2EDuration="2m25.909074054s" podCreationTimestamp="2025-10-07 19:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:02:56.906396978 +0000 UTC m=+165.728435615" watchObservedRunningTime="2025-10-07 19:02:56.909074054 +0000 UTC m=+165.731112691" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.913572 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"eaa5476c-840a-48a8-b6d5-6ced96751e74","Type":"ContainerDied","Data":"7f5226326a77af5182451d4d111c37c25b2d0e3717580da33b5eeae0a3849243"} Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.913616 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f5226326a77af5182451d4d111c37c25b2d0e3717580da33b5eeae0a3849243" Oct 07 19:02:56 crc kubenswrapper[4825]: I1007 19:02:56.913672 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 19:02:58 crc kubenswrapper[4825]: I1007 19:02:58.796143 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lmk59" Oct 07 19:02:59 crc kubenswrapper[4825]: I1007 19:02:59.932537 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f","Type":"ContainerDied","Data":"489b435783243f28dac755206359fdcef714861544aa64a7606d553053999f7b"} Oct 07 19:02:59 crc kubenswrapper[4825]: I1007 19:02:59.932793 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="489b435783243f28dac755206359fdcef714861544aa64a7606d553053999f7b" Oct 07 19:02:59 crc kubenswrapper[4825]: I1007 19:02:59.961818 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:03:00 crc kubenswrapper[4825]: I1007 19:03:00.103030 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kubelet-dir\") pod \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " Oct 07 19:03:00 crc kubenswrapper[4825]: I1007 19:03:00.103147 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kube-api-access\") pod \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\" (UID: \"4ea549f0-6076-4cd3-a3d5-fa5274a8b37f\") " Oct 07 19:03:00 crc kubenswrapper[4825]: I1007 19:03:00.103395 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ea549f0-6076-4cd3-a3d5-fa5274a8b37f" 
(UID: "4ea549f0-6076-4cd3-a3d5-fa5274a8b37f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:03:00 crc kubenswrapper[4825]: I1007 19:03:00.104751 4825 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:00 crc kubenswrapper[4825]: I1007 19:03:00.113618 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ea549f0-6076-4cd3-a3d5-fa5274a8b37f" (UID: "4ea549f0-6076-4cd3-a3d5-fa5274a8b37f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:03:00 crc kubenswrapper[4825]: I1007 19:03:00.206018 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea549f0-6076-4cd3-a3d5-fa5274a8b37f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:00 crc kubenswrapper[4825]: I1007 19:03:00.941255 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 19:03:02 crc kubenswrapper[4825]: I1007 19:03:02.602991 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hpckv" Oct 07 19:03:02 crc kubenswrapper[4825]: I1007 19:03:02.869436 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:03:02 crc kubenswrapper[4825]: I1007 19:03:02.873649 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:03:05 crc kubenswrapper[4825]: I1007 19:03:05.709367 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:03:05 crc kubenswrapper[4825]: I1007 19:03:05.709770 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:03:10 crc kubenswrapper[4825]: I1007 19:03:10.259970 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 19:03:10 crc kubenswrapper[4825]: I1007 19:03:10.368481 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:03:19 crc kubenswrapper[4825]: E1007 19:03:19.809001 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 19:03:19 crc kubenswrapper[4825]: E1007 19:03:19.809717 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwttq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nww4f_openshift-marketplace(37844b25-13d2-4bd7-8807-35c4bc1a4dde): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Oct 07 19:03:19 crc kubenswrapper[4825]: E1007 19:03:19.810980 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nww4f" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" Oct 07 19:03:20 crc kubenswrapper[4825]: E1007 19:03:20.487960 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nww4f" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" Oct 07 19:03:23 crc kubenswrapper[4825]: E1007 19:03:23.189296 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 19:03:23 crc kubenswrapper[4825]: E1007 19:03:23.189534 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5vxmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5shkc_openshift-marketplace(43171f5c-ea7f-43d8-bdec-0d8f5b5c907c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 19:03:23 crc kubenswrapper[4825]: E1007 19:03:23.190885 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5shkc" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" Oct 07 19:03:23 crc 
kubenswrapper[4825]: I1007 19:03:23.730374 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dx9x2" Oct 07 19:03:23 crc kubenswrapper[4825]: E1007 19:03:23.753038 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 19:03:23 crc kubenswrapper[4825]: E1007 19:03:23.753205 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qb9tj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pkxnc_openshift-marketplace(6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 19:03:23 crc kubenswrapper[4825]: E1007 19:03:23.754909 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pkxnc" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" Oct 07 19:03:26 crc kubenswrapper[4825]: E1007 19:03:26.496783 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pkxnc" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" Oct 07 19:03:26 crc kubenswrapper[4825]: E1007 19:03:26.497182 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5shkc" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" Oct 07 19:03:29 crc kubenswrapper[4825]: E1007 19:03:29.581570 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get 
\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 19:03:29 crc kubenswrapper[4825]: E1007 19:03:29.582017 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7vsrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-5gwml_openshift-marketplace(fa9407d2-7436-4a1e-82ef-babe5b4db5e9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\": context canceled" logger="UnhandledError" Oct 07 19:03:29 crc kubenswrapper[4825]: E1007 19:03:29.583342 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-5gwml" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" Oct 07 19:03:29 crc kubenswrapper[4825]: E1007 19:03:29.619881 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 19:03:29 crc kubenswrapper[4825]: E1007 19:03:29.620401 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qx4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zgk7n_openshift-marketplace(2a7c36e3-249c-4ad2-ab08-5b00b3f71218): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 19:03:29 crc kubenswrapper[4825]: E1007 19:03:29.621610 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zgk7n" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" Oct 07 19:03:30 crc 
kubenswrapper[4825]: E1007 19:03:30.270658 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zgk7n" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" Oct 07 19:03:30 crc kubenswrapper[4825]: E1007 19:03:30.270676 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5gwml" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" Oct 07 19:03:31 crc kubenswrapper[4825]: I1007 19:03:31.143398 4825 generic.go:334] "Generic (PLEG): container finished" podID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerID="c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2" exitCode=0 Oct 07 19:03:31 crc kubenswrapper[4825]: I1007 19:03:31.143467 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8pgw" event={"ID":"d980b20c-41bd-4aff-9a22-2e806ce8d5cf","Type":"ContainerDied","Data":"c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2"} Oct 07 19:03:31 crc kubenswrapper[4825]: I1007 19:03:31.148540 4825 generic.go:334] "Generic (PLEG): container finished" podID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerID="d4a13d63b02b29e84d731f5ea381c2e0a2f2f2aead0dcefbd065c14b0a317ca3" exitCode=0 Oct 07 19:03:31 crc kubenswrapper[4825]: I1007 19:03:31.148599 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k9wd" event={"ID":"5ca5c5b3-f474-4650-9e03-c20849b4f03b","Type":"ContainerDied","Data":"d4a13d63b02b29e84d731f5ea381c2e0a2f2f2aead0dcefbd065c14b0a317ca3"} Oct 07 19:03:31 crc kubenswrapper[4825]: I1007 19:03:31.151690 4825 generic.go:334] 
"Generic (PLEG): container finished" podID="7ea1416f-9faf-428d-bf84-9308f267669e" containerID="df0e52f710f6ddff17403588de26fa246bd35b33ad888461b53f255a370b232a" exitCode=0 Oct 07 19:03:31 crc kubenswrapper[4825]: I1007 19:03:31.151712 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdk5w" event={"ID":"7ea1416f-9faf-428d-bf84-9308f267669e","Type":"ContainerDied","Data":"df0e52f710f6ddff17403588de26fa246bd35b33ad888461b53f255a370b232a"} Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.161305 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8pgw" event={"ID":"d980b20c-41bd-4aff-9a22-2e806ce8d5cf","Type":"ContainerStarted","Data":"67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c"} Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.168865 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k9wd" event={"ID":"5ca5c5b3-f474-4650-9e03-c20849b4f03b","Type":"ContainerStarted","Data":"84b85e890c84d7dffae6e0ef01d0cd172bd26e3773e887b45f4431daa24a1653"} Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.173567 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdk5w" event={"ID":"7ea1416f-9faf-428d-bf84-9308f267669e","Type":"ContainerStarted","Data":"25c884a521c6cc277b3b8060e28c662c6bf6abcf5cd343748303557697136e99"} Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.190655 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x8pgw" podStartSLOduration=3.14701836 podStartE2EDuration="40.190631124s" podCreationTimestamp="2025-10-07 19:02:52 +0000 UTC" firstStartedPulling="2025-10-07 19:02:54.84649185 +0000 UTC m=+163.668530487" lastFinishedPulling="2025-10-07 19:03:31.890104594 +0000 UTC m=+200.712143251" observedRunningTime="2025-10-07 19:03:32.185482018 +0000 UTC 
m=+201.007520665" watchObservedRunningTime="2025-10-07 19:03:32.190631124 +0000 UTC m=+201.012669801" Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.210540 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rdk5w" podStartSLOduration=2.386760025 podStartE2EDuration="40.210517853s" podCreationTimestamp="2025-10-07 19:02:52 +0000 UTC" firstStartedPulling="2025-10-07 19:02:53.806523035 +0000 UTC m=+162.628561672" lastFinishedPulling="2025-10-07 19:03:31.630280833 +0000 UTC m=+200.452319500" observedRunningTime="2025-10-07 19:03:32.20946479 +0000 UTC m=+201.031503447" watchObservedRunningTime="2025-10-07 19:03:32.210517853 +0000 UTC m=+201.032556530" Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.232487 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2k9wd" podStartSLOduration=2.3751849050000002 podStartE2EDuration="42.23246162s" podCreationTimestamp="2025-10-07 19:02:50 +0000 UTC" firstStartedPulling="2025-10-07 19:02:51.737153904 +0000 UTC m=+160.559192541" lastFinishedPulling="2025-10-07 19:03:31.594430579 +0000 UTC m=+200.416469256" observedRunningTime="2025-10-07 19:03:32.229598708 +0000 UTC m=+201.051637375" watchObservedRunningTime="2025-10-07 19:03:32.23246162 +0000 UTC m=+201.054500267" Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.720111 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:03:32 crc kubenswrapper[4825]: I1007 19:03:32.720153 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:03:33 crc kubenswrapper[4825]: I1007 19:03:33.181841 4825 generic.go:334] "Generic (PLEG): container finished" podID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerID="066e924a191f04a389e8db4d6617ba5de14dd7116153957f5b28913e6dcc29c2" 
exitCode=0 Oct 07 19:03:33 crc kubenswrapper[4825]: I1007 19:03:33.181909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nww4f" event={"ID":"37844b25-13d2-4bd7-8807-35c4bc1a4dde","Type":"ContainerDied","Data":"066e924a191f04a389e8db4d6617ba5de14dd7116153957f5b28913e6dcc29c2"} Oct 07 19:03:33 crc kubenswrapper[4825]: I1007 19:03:33.329537 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:03:33 crc kubenswrapper[4825]: I1007 19:03:33.329841 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:03:33 crc kubenswrapper[4825]: I1007 19:03:33.848659 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rdk5w" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="registry-server" probeResult="failure" output=< Oct 07 19:03:33 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Oct 07 19:03:33 crc kubenswrapper[4825]: > Oct 07 19:03:34 crc kubenswrapper[4825]: I1007 19:03:34.196568 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nww4f" event={"ID":"37844b25-13d2-4bd7-8807-35c4bc1a4dde","Type":"ContainerStarted","Data":"8e6b9a760b362579156ba326f0e9601ca4c22772dd1973e1b91856bb327cf2d3"} Oct 07 19:03:34 crc kubenswrapper[4825]: I1007 19:03:34.227304 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nww4f" podStartSLOduration=2.3325649410000002 podStartE2EDuration="45.227283079s" podCreationTimestamp="2025-10-07 19:02:49 +0000 UTC" firstStartedPulling="2025-10-07 19:02:50.714946848 +0000 UTC m=+159.536985485" lastFinishedPulling="2025-10-07 19:03:33.609664986 +0000 UTC m=+202.431703623" observedRunningTime="2025-10-07 19:03:34.22390651 +0000 UTC 
m=+203.045945187" watchObservedRunningTime="2025-10-07 19:03:34.227283079 +0000 UTC m=+203.049321746" Oct 07 19:03:34 crc kubenswrapper[4825]: I1007 19:03:34.396729 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x8pgw" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="registry-server" probeResult="failure" output=< Oct 07 19:03:34 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Oct 07 19:03:34 crc kubenswrapper[4825]: > Oct 07 19:03:35 crc kubenswrapper[4825]: I1007 19:03:35.708374 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:03:35 crc kubenswrapper[4825]: I1007 19:03:35.708455 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:03:35 crc kubenswrapper[4825]: I1007 19:03:35.708524 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:03:35 crc kubenswrapper[4825]: I1007 19:03:35.709251 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:03:35 crc kubenswrapper[4825]: I1007 19:03:35.709392 4825 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954" gracePeriod=600 Oct 07 19:03:36 crc kubenswrapper[4825]: I1007 19:03:36.210813 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954" exitCode=0 Oct 07 19:03:36 crc kubenswrapper[4825]: I1007 19:03:36.210894 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954"} Oct 07 19:03:36 crc kubenswrapper[4825]: I1007 19:03:36.211152 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"d59266bf242c50a2596b3ab7b505a4aa50801a6525e38f53609ceb79dca8838b"} Oct 07 19:03:40 crc kubenswrapper[4825]: I1007 19:03:40.105410 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:03:40 crc kubenswrapper[4825]: I1007 19:03:40.106094 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:03:40 crc kubenswrapper[4825]: I1007 19:03:40.216806 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:03:40 crc kubenswrapper[4825]: I1007 19:03:40.316554 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:03:40 crc kubenswrapper[4825]: I1007 19:03:40.708741 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:03:40 crc kubenswrapper[4825]: I1007 19:03:40.709206 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:03:40 crc kubenswrapper[4825]: I1007 19:03:40.770859 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:03:41 crc kubenswrapper[4825]: I1007 19:03:41.312558 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:03:42 crc kubenswrapper[4825]: I1007 19:03:42.796199 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:03:42 crc kubenswrapper[4825]: I1007 19:03:42.862405 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:03:42 crc kubenswrapper[4825]: I1007 19:03:42.867312 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2k9wd"] Oct 07 19:03:43 crc kubenswrapper[4825]: I1007 19:03:43.258214 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2k9wd" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="registry-server" containerID="cri-o://84b85e890c84d7dffae6e0ef01d0cd172bd26e3773e887b45f4431daa24a1653" gracePeriod=2 Oct 07 19:03:43 crc kubenswrapper[4825]: I1007 19:03:43.402103 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:03:43 crc kubenswrapper[4825]: I1007 19:03:43.471674 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.263636 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdk5w"] Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.264900 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rdk5w" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="registry-server" containerID="cri-o://25c884a521c6cc277b3b8060e28c662c6bf6abcf5cd343748303557697136e99" gracePeriod=2 Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.274778 4825 generic.go:334] "Generic (PLEG): container finished" podID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerID="84b85e890c84d7dffae6e0ef01d0cd172bd26e3773e887b45f4431daa24a1653" exitCode=0 Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.274822 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k9wd" event={"ID":"5ca5c5b3-f474-4650-9e03-c20849b4f03b","Type":"ContainerDied","Data":"84b85e890c84d7dffae6e0ef01d0cd172bd26e3773e887b45f4431daa24a1653"} Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.274852 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k9wd" event={"ID":"5ca5c5b3-f474-4650-9e03-c20849b4f03b","Type":"ContainerDied","Data":"3e00b503c003868de90a55bfb99af25f9c481f929ecc9d0cba910de21b1cd8d4"} Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.274864 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e00b503c003868de90a55bfb99af25f9c481f929ecc9d0cba910de21b1cd8d4" Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.309133 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.494539 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-utilities\") pod \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.494640 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p58bg\" (UniqueName: \"kubernetes.io/projected/5ca5c5b3-f474-4650-9e03-c20849b4f03b-kube-api-access-p58bg\") pod \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.494715 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-catalog-content\") pod \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\" (UID: \"5ca5c5b3-f474-4650-9e03-c20849b4f03b\") " Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.497025 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-utilities" (OuterVolumeSpecName: "utilities") pod "5ca5c5b3-f474-4650-9e03-c20849b4f03b" (UID: "5ca5c5b3-f474-4650-9e03-c20849b4f03b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.502992 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca5c5b3-f474-4650-9e03-c20849b4f03b-kube-api-access-p58bg" (OuterVolumeSpecName: "kube-api-access-p58bg") pod "5ca5c5b3-f474-4650-9e03-c20849b4f03b" (UID: "5ca5c5b3-f474-4650-9e03-c20849b4f03b"). InnerVolumeSpecName "kube-api-access-p58bg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.597100 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:45 crc kubenswrapper[4825]: I1007 19:03:45.597164 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p58bg\" (UniqueName: \"kubernetes.io/projected/5ca5c5b3-f474-4650-9e03-c20849b4f03b-kube-api-access-p58bg\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.048975 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ca5c5b3-f474-4650-9e03-c20849b4f03b" (UID: "5ca5c5b3-f474-4650-9e03-c20849b4f03b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.105559 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5c5b3-f474-4650-9e03-c20849b4f03b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.281912 4825 generic.go:334] "Generic (PLEG): container finished" podID="7ea1416f-9faf-428d-bf84-9308f267669e" containerID="25c884a521c6cc277b3b8060e28c662c6bf6abcf5cd343748303557697136e99" exitCode=0 Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.282021 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdk5w" event={"ID":"7ea1416f-9faf-428d-bf84-9308f267669e","Type":"ContainerDied","Data":"25c884a521c6cc277b3b8060e28c662c6bf6abcf5cd343748303557697136e99"} Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.282056 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k9wd" Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.317303 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2k9wd"] Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.321909 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2k9wd"] Oct 07 19:03:46 crc kubenswrapper[4825]: I1007 19:03:46.857291 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.022708 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-catalog-content\") pod \"7ea1416f-9faf-428d-bf84-9308f267669e\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.023112 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-utilities\") pod \"7ea1416f-9faf-428d-bf84-9308f267669e\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.023198 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsn8x\" (UniqueName: \"kubernetes.io/projected/7ea1416f-9faf-428d-bf84-9308f267669e-kube-api-access-nsn8x\") pod \"7ea1416f-9faf-428d-bf84-9308f267669e\" (UID: \"7ea1416f-9faf-428d-bf84-9308f267669e\") " Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.023820 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-utilities" (OuterVolumeSpecName: "utilities") pod "7ea1416f-9faf-428d-bf84-9308f267669e" (UID: 
"7ea1416f-9faf-428d-bf84-9308f267669e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.033521 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea1416f-9faf-428d-bf84-9308f267669e-kube-api-access-nsn8x" (OuterVolumeSpecName: "kube-api-access-nsn8x") pod "7ea1416f-9faf-428d-bf84-9308f267669e" (UID: "7ea1416f-9faf-428d-bf84-9308f267669e"). InnerVolumeSpecName "kube-api-access-nsn8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.048202 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ea1416f-9faf-428d-bf84-9308f267669e" (UID: "7ea1416f-9faf-428d-bf84-9308f267669e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.124519 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsn8x\" (UniqueName: \"kubernetes.io/projected/7ea1416f-9faf-428d-bf84-9308f267669e-kube-api-access-nsn8x\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.124550 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.124564 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea1416f-9faf-428d-bf84-9308f267669e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.290052 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rdk5w" event={"ID":"7ea1416f-9faf-428d-bf84-9308f267669e","Type":"ContainerDied","Data":"7b4965579a169c45fee7da22bf38a7ba7b69be65633e0e92f49fbb0bd917909c"} Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.290136 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdk5w" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.290163 4825 scope.go:117] "RemoveContainer" containerID="25c884a521c6cc277b3b8060e28c662c6bf6abcf5cd343748303557697136e99" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.328323 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdk5w"] Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.341017 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdk5w"] Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.802027 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" path="/var/lib/kubelet/pods/5ca5c5b3-f474-4650-9e03-c20849b4f03b/volumes" Oct 07 19:03:47 crc kubenswrapper[4825]: I1007 19:03:47.802939 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" path="/var/lib/kubelet/pods/7ea1416f-9faf-428d-bf84-9308f267669e/volumes" Oct 07 19:03:50 crc kubenswrapper[4825]: I1007 19:03:50.517060 4825 scope.go:117] "RemoveContainer" containerID="df0e52f710f6ddff17403588de26fa246bd35b33ad888461b53f255a370b232a" Oct 07 19:03:53 crc kubenswrapper[4825]: I1007 19:03:53.021944 4825 scope.go:117] "RemoveContainer" containerID="c53d18abb17e375e74d0aa47a8f09968caac70ae15b785a5480243beacb9c83b" Oct 07 19:03:53 crc kubenswrapper[4825]: E1007 19:03:53.828572 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7c36e3_249c_4ad2_ab08_5b00b3f71218.slice/crio-2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9.scope\": RecentStats: unable to find data in memory cache]" Oct 07 19:03:54 crc kubenswrapper[4825]: I1007 19:03:54.390715 4825 generic.go:334] "Generic (PLEG): container finished" podID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerID="ae8ddc3bf3aa7da78c3bfe329936d3903ece842324e51b7b2acfb7a75a1fd031" exitCode=0 Oct 07 19:03:54 crc kubenswrapper[4825]: I1007 19:03:54.390836 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5shkc" event={"ID":"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c","Type":"ContainerDied","Data":"ae8ddc3bf3aa7da78c3bfe329936d3903ece842324e51b7b2acfb7a75a1fd031"} Oct 07 19:03:54 crc kubenswrapper[4825]: I1007 19:03:54.397714 4825 generic.go:334] "Generic (PLEG): container finished" podID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerID="d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b" exitCode=0 Oct 07 19:03:54 crc kubenswrapper[4825]: I1007 19:03:54.397845 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkxnc" event={"ID":"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb","Type":"ContainerDied","Data":"d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b"} Oct 07 19:03:54 crc kubenswrapper[4825]: I1007 19:03:54.403817 4825 generic.go:334] "Generic (PLEG): container finished" podID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerID="f021f25ab41bdcd6000599d9d268545df67acbc3ed9ad87989426f560eb2e1a4" exitCode=0 Oct 07 19:03:54 crc kubenswrapper[4825]: I1007 19:03:54.404006 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gwml" event={"ID":"fa9407d2-7436-4a1e-82ef-babe5b4db5e9","Type":"ContainerDied","Data":"f021f25ab41bdcd6000599d9d268545df67acbc3ed9ad87989426f560eb2e1a4"} Oct 07 19:03:54 crc 
kubenswrapper[4825]: I1007 19:03:54.412260 4825 generic.go:334] "Generic (PLEG): container finished" podID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerID="2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9" exitCode=0 Oct 07 19:03:54 crc kubenswrapper[4825]: I1007 19:03:54.412313 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgk7n" event={"ID":"2a7c36e3-249c-4ad2-ab08-5b00b3f71218","Type":"ContainerDied","Data":"2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9"} Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.419288 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5shkc" event={"ID":"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c","Type":"ContainerStarted","Data":"11150b0418c0c3eb143ab15458b16fe7d960c15a40224d0f143611fc36fa24bf"} Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.421254 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkxnc" event={"ID":"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb","Type":"ContainerStarted","Data":"c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde"} Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.423302 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gwml" event={"ID":"fa9407d2-7436-4a1e-82ef-babe5b4db5e9","Type":"ContainerStarted","Data":"7b90ce285a9f323dc10ce2c110cfe75a23531b69b4de2cab6cead28af13d7e06"} Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.425309 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgk7n" event={"ID":"2a7c36e3-249c-4ad2-ab08-5b00b3f71218","Type":"ContainerStarted","Data":"512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f"} Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.450019 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-5shkc" podStartSLOduration=3.134824443 podStartE2EDuration="1m6.450001206s" podCreationTimestamp="2025-10-07 19:02:49 +0000 UTC" firstStartedPulling="2025-10-07 19:02:51.728078455 +0000 UTC m=+160.550117092" lastFinishedPulling="2025-10-07 19:03:55.043255218 +0000 UTC m=+223.865293855" observedRunningTime="2025-10-07 19:03:55.446413661 +0000 UTC m=+224.268452298" watchObservedRunningTime="2025-10-07 19:03:55.450001206 +0000 UTC m=+224.272039853" Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.463331 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zgk7n" podStartSLOduration=2.429004066 podStartE2EDuration="1m2.463309545s" podCreationTimestamp="2025-10-07 19:02:53 +0000 UTC" firstStartedPulling="2025-10-07 19:02:54.838374881 +0000 UTC m=+163.660413518" lastFinishedPulling="2025-10-07 19:03:54.87268036 +0000 UTC m=+223.694718997" observedRunningTime="2025-10-07 19:03:55.461034391 +0000 UTC m=+224.283073028" watchObservedRunningTime="2025-10-07 19:03:55.463309545 +0000 UTC m=+224.285348192" Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.480758 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pkxnc" podStartSLOduration=2.325058393 podStartE2EDuration="1m5.480736945s" podCreationTimestamp="2025-10-07 19:02:50 +0000 UTC" firstStartedPulling="2025-10-07 19:02:51.732300509 +0000 UTC m=+160.554339146" lastFinishedPulling="2025-10-07 19:03:54.887979061 +0000 UTC m=+223.710017698" observedRunningTime="2025-10-07 19:03:55.478544005 +0000 UTC m=+224.300582642" watchObservedRunningTime="2025-10-07 19:03:55.480736945 +0000 UTC m=+224.302775582" Oct 07 19:03:55 crc kubenswrapper[4825]: I1007 19:03:55.500315 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5gwml" podStartSLOduration=3.534836908 
podStartE2EDuration="1m4.500298674s" podCreationTimestamp="2025-10-07 19:02:51 +0000 UTC" firstStartedPulling="2025-10-07 19:02:53.853568847 +0000 UTC m=+162.675607484" lastFinishedPulling="2025-10-07 19:03:54.819030603 +0000 UTC m=+223.641069250" observedRunningTime="2025-10-07 19:03:55.498072043 +0000 UTC m=+224.320110670" watchObservedRunningTime="2025-10-07 19:03:55.500298674 +0000 UTC m=+224.322337311" Oct 07 19:04:00 crc kubenswrapper[4825]: I1007 19:04:00.300426 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:04:00 crc kubenswrapper[4825]: I1007 19:04:00.300926 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:04:00 crc kubenswrapper[4825]: I1007 19:04:00.381393 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:04:00 crc kubenswrapper[4825]: I1007 19:04:00.501815 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:04:00 crc kubenswrapper[4825]: I1007 19:04:00.510612 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:04:00 crc kubenswrapper[4825]: I1007 19:04:00.510947 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:04:00 crc kubenswrapper[4825]: I1007 19:04:00.564733 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:04:01 crc kubenswrapper[4825]: I1007 19:04:01.515103 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:04:02 crc kubenswrapper[4825]: I1007 19:04:02.147933 
4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x74mv"] Oct 07 19:04:02 crc kubenswrapper[4825]: I1007 19:04:02.325448 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:04:02 crc kubenswrapper[4825]: I1007 19:04:02.325756 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:04:02 crc kubenswrapper[4825]: I1007 19:04:02.373557 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:04:02 crc kubenswrapper[4825]: I1007 19:04:02.506748 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:04:03 crc kubenswrapper[4825]: I1007 19:04:03.732535 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:04:03 crc kubenswrapper[4825]: I1007 19:04:03.732569 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:04:03 crc kubenswrapper[4825]: I1007 19:04:03.788116 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:04:04 crc kubenswrapper[4825]: I1007 19:04:04.509269 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:04:04 crc kubenswrapper[4825]: I1007 19:04:04.657624 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pkxnc"] Oct 07 19:04:04 crc kubenswrapper[4825]: I1007 19:04:04.658141 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pkxnc" 
podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="registry-server" containerID="cri-o://c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde" gracePeriod=2 Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.244504 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.377743 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-utilities\") pod \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.377819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb9tj\" (UniqueName: \"kubernetes.io/projected/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-kube-api-access-qb9tj\") pod \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.377865 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-catalog-content\") pod \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\" (UID: \"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb\") " Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.378858 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-utilities" (OuterVolumeSpecName: "utilities") pod "6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" (UID: "6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.384389 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-kube-api-access-qb9tj" (OuterVolumeSpecName: "kube-api-access-qb9tj") pod "6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" (UID: "6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb"). InnerVolumeSpecName "kube-api-access-qb9tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.478477 4825 generic.go:334] "Generic (PLEG): container finished" podID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerID="c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde" exitCode=0 Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.478529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkxnc" event={"ID":"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb","Type":"ContainerDied","Data":"c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde"} Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.478573 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkxnc" event={"ID":"6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb","Type":"ContainerDied","Data":"7069b039c5aeabc4ab412b87d949f092042c175c3984b981092a7372de3b3a26"} Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.478593 4825 scope.go:117] "RemoveContainer" containerID="c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.478595 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pkxnc" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.478963 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.479023 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb9tj\" (UniqueName: \"kubernetes.io/projected/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-kube-api-access-qb9tj\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.497925 4825 scope.go:117] "RemoveContainer" containerID="d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.521332 4825 scope.go:117] "RemoveContainer" containerID="bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.550712 4825 scope.go:117] "RemoveContainer" containerID="c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde" Oct 07 19:04:05 crc kubenswrapper[4825]: E1007 19:04:05.551123 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde\": container with ID starting with c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde not found: ID does not exist" containerID="c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.551155 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde"} err="failed to get container status \"c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde\": rpc error: code = NotFound desc = could 
not find container \"c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde\": container with ID starting with c0b74944dbc5e39f14b4343bb1be9ca595991e2c8c204694120b4cf5edfe8cde not found: ID does not exist" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.551175 4825 scope.go:117] "RemoveContainer" containerID="d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b" Oct 07 19:04:05 crc kubenswrapper[4825]: E1007 19:04:05.551375 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b\": container with ID starting with d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b not found: ID does not exist" containerID="d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.551394 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b"} err="failed to get container status \"d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b\": rpc error: code = NotFound desc = could not find container \"d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b\": container with ID starting with d56d7a250e0fb004358ec049aeb60647325dc5db2b356406dbb059e0bbb1010b not found: ID does not exist" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.551416 4825 scope.go:117] "RemoveContainer" containerID="bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a" Oct 07 19:04:05 crc kubenswrapper[4825]: E1007 19:04:05.551636 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a\": container with ID starting with bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a not found: 
ID does not exist" containerID="bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.551651 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a"} err="failed to get container status \"bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a\": rpc error: code = NotFound desc = could not find container \"bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a\": container with ID starting with bb28670c35b83c949a34d6b57fe8e60db23ba1b2e546b565d030f23ad02e0d3a not found: ID does not exist" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.558888 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" (UID: "6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.580760 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.805546 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pkxnc"] Oct 07 19:04:05 crc kubenswrapper[4825]: I1007 19:04:05.810876 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pkxnc"] Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.057496 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgk7n"] Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.057999 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zgk7n" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="registry-server" containerID="cri-o://512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f" gracePeriod=2 Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.435388 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.490307 4825 generic.go:334] "Generic (PLEG): container finished" podID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerID="512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f" exitCode=0 Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.490357 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgk7n" event={"ID":"2a7c36e3-249c-4ad2-ab08-5b00b3f71218","Type":"ContainerDied","Data":"512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f"} Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.490370 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgk7n" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.490388 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgk7n" event={"ID":"2a7c36e3-249c-4ad2-ab08-5b00b3f71218","Type":"ContainerDied","Data":"77c2310de5bdc804406507410a9e0cf608fdc84585557c488917f899b7c50510"} Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.490407 4825 scope.go:117] "RemoveContainer" containerID="512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.507683 4825 scope.go:117] "RemoveContainer" containerID="2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.520216 4825 scope.go:117] "RemoveContainer" containerID="b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.541159 4825 scope.go:117] "RemoveContainer" containerID="512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f" Oct 07 19:04:07 crc kubenswrapper[4825]: E1007 19:04:07.541610 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f\": container with ID starting with 512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f not found: ID does not exist" containerID="512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.541662 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f"} err="failed to get container status \"512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f\": rpc error: code = NotFound desc = could not find container \"512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f\": container with ID starting with 512c8602bdba5898d9089a3a1c3bb5ec883d4127e51cb1112f8395a92a11392f not found: ID does not exist" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.541693 4825 scope.go:117] "RemoveContainer" containerID="2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9" Oct 07 19:04:07 crc kubenswrapper[4825]: E1007 19:04:07.542071 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9\": container with ID starting with 2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9 not found: ID does not exist" containerID="2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.542104 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9"} err="failed to get container status \"2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9\": rpc error: code = NotFound desc = could not find container 
\"2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9\": container with ID starting with 2e930a30cdf24ed116af4abd7c0e712dacfffec967d9957251a6ebf33ffcfde9 not found: ID does not exist" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.542128 4825 scope.go:117] "RemoveContainer" containerID="b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4" Oct 07 19:04:07 crc kubenswrapper[4825]: E1007 19:04:07.542512 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4\": container with ID starting with b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4 not found: ID does not exist" containerID="b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.542565 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4"} err="failed to get container status \"b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4\": rpc error: code = NotFound desc = could not find container \"b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4\": container with ID starting with b8f7509af75f7a99e2db43046239b7b3378a6cb2bc7326dcd9f651373a484ab4 not found: ID does not exist" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.605449 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-catalog-content\") pod \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.605591 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-utilities\") pod \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.605631 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qx4l\" (UniqueName: \"kubernetes.io/projected/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-kube-api-access-7qx4l\") pod \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\" (UID: \"2a7c36e3-249c-4ad2-ab08-5b00b3f71218\") " Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.607372 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-utilities" (OuterVolumeSpecName: "utilities") pod "2a7c36e3-249c-4ad2-ab08-5b00b3f71218" (UID: "2a7c36e3-249c-4ad2-ab08-5b00b3f71218"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.612398 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-kube-api-access-7qx4l" (OuterVolumeSpecName: "kube-api-access-7qx4l") pod "2a7c36e3-249c-4ad2-ab08-5b00b3f71218" (UID: "2a7c36e3-249c-4ad2-ab08-5b00b3f71218"). InnerVolumeSpecName "kube-api-access-7qx4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.707384 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.707415 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qx4l\" (UniqueName: \"kubernetes.io/projected/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-kube-api-access-7qx4l\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.711070 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a7c36e3-249c-4ad2-ab08-5b00b3f71218" (UID: "2a7c36e3-249c-4ad2-ab08-5b00b3f71218"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.801809 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" path="/var/lib/kubelet/pods/6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb/volumes" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.808314 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7c36e3-249c-4ad2-ab08-5b00b3f71218-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.824241 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgk7n"] Oct 07 19:04:07 crc kubenswrapper[4825]: I1007 19:04:07.827405 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zgk7n"] Oct 07 19:04:09 crc kubenswrapper[4825]: I1007 19:04:09.803620 4825 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" path="/var/lib/kubelet/pods/2a7c36e3-249c-4ad2-ab08-5b00b3f71218/volumes" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.190052 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" podUID="7c0bb39b-ac5f-48e5-87a8-80f21b338c02" containerName="oauth-openshift" containerID="cri-o://c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d" gracePeriod=15 Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.586529 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.620643 4825 generic.go:334] "Generic (PLEG): container finished" podID="7c0bb39b-ac5f-48e5-87a8-80f21b338c02" containerID="c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d" exitCode=0 Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.620710 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" event={"ID":"7c0bb39b-ac5f-48e5-87a8-80f21b338c02","Type":"ContainerDied","Data":"c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d"} Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.620750 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" event={"ID":"7c0bb39b-ac5f-48e5-87a8-80f21b338c02","Type":"ContainerDied","Data":"e3b795932049d99605f3d7812391526232b474db7a7a1128678e64b70335d5b9"} Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.620780 4825 scope.go:117] "RemoveContainer" containerID="c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.620972 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x74mv" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638089 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5c8f64895f-s8p9d"] Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638382 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638399 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638416 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa5476c-840a-48a8-b6d5-6ced96751e74" containerName="pruner" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638424 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa5476c-840a-48a8-b6d5-6ced96751e74" containerName="pruner" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638432 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638440 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638451 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638459 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638468 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638475 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638486 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638493 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638504 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638512 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638526 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638535 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="extract-utilities" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638549 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638557 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638567 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c0bb39b-ac5f-48e5-87a8-80f21b338c02" containerName="oauth-openshift" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638576 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0bb39b-ac5f-48e5-87a8-80f21b338c02" containerName="oauth-openshift" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638588 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638596 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638609 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638617 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638628 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638635 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638646 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638654 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="extract-content" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.638665 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4ea549f0-6076-4cd3-a3d5-fa5274a8b37f" containerName="pruner" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638672 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea549f0-6076-4cd3-a3d5-fa5274a8b37f" containerName="pruner" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638775 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0bb39b-ac5f-48e5-87a8-80f21b338c02" containerName="oauth-openshift" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638789 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea1416f-9faf-428d-bf84-9308f267669e" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638802 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5c54d0-f559-4e3d-8be1-5a0cb5293dfb" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638811 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea549f0-6076-4cd3-a3d5-fa5274a8b37f" containerName="pruner" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638821 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa5476c-840a-48a8-b6d5-6ced96751e74" containerName="pruner" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638830 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7c36e3-249c-4ad2-ab08-5b00b3f71218" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.638842 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca5c5b3-f474-4650-9e03-c20849b4f03b" containerName="registry-server" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.639302 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.657040 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5c8f64895f-s8p9d"] Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.663548 4825 scope.go:117] "RemoveContainer" containerID="c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d" Oct 07 19:04:27 crc kubenswrapper[4825]: E1007 19:04:27.663969 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d\": container with ID starting with c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d not found: ID does not exist" containerID="c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.664003 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d"} err="failed to get container status \"c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d\": rpc error: code = NotFound desc = could not find container \"c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d\": container with ID starting with c081e68101ef34da84363844f03f7e12d755d33d0cea5115d38a1c72b3df182d not found: ID does not exist" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.703855 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-cliconfig\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.703909 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-trusted-ca-bundle\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.703937 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-service-ca\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.703971 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-idp-0-file-data\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704044 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-session\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704078 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-serving-cert\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704127 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-error\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704161 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-policies\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704183 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-router-certs\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704204 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-dir\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704245 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzrp8\" (UniqueName: \"kubernetes.io/projected/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-kube-api-access-tzrp8\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704268 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-login\") pod 
\"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704291 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-provider-selection\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704317 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-ocp-branding-template\") pod \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\" (UID: \"7c0bb39b-ac5f-48e5-87a8-80f21b338c02\") " Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704946 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.704984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.705008 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.705339 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.707281 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.712496 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-kube-api-access-tzrp8" (OuterVolumeSpecName: "kube-api-access-tzrp8") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "kube-api-access-tzrp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.712636 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.712801 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.713150 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.713575 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.713762 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.714331 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.714760 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.715000 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7c0bb39b-ac5f-48e5-87a8-80f21b338c02" (UID: "7c0bb39b-ac5f-48e5-87a8-80f21b338c02"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.806448 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.806588 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.806641 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.806895 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807063 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807135 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-session\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807543 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-error\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807601 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807628 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-login\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807691 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxb7d\" (UniqueName: \"kubernetes.io/projected/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-kube-api-access-gxb7d\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807756 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807787 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-audit-dir\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807812 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-audit-policies\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " 
pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807874 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807968 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807986 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.807999 4825 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808012 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808026 4825 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 
19:04:27.808038 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzrp8\" (UniqueName: \"kubernetes.io/projected/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-kube-api-access-tzrp8\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808068 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808081 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808094 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808106 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808117 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808129 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808140 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.808152 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c0bb39b-ac5f-48e5-87a8-80f21b338c02-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908762 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908851 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-session\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " 
pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908868 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-error\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908899 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908917 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-login\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908944 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxb7d\" (UniqueName: \"kubernetes.io/projected/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-kube-api-access-gxb7d\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908962 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908979 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-audit-dir\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.908996 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-audit-policies\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.909017 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.909040 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " 
pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.909059 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.909075 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.909473 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-audit-dir\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.910337 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.910739 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-audit-policies\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.910927 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.912618 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.913175 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.913326 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-login\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 
19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.913731 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.913823 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-session\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.914510 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.915557 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-user-template-error\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.916198 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.916385 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.929918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxb7d\" (UniqueName: \"kubernetes.io/projected/58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7-kube-api-access-gxb7d\") pod \"oauth-openshift-5c8f64895f-s8p9d\" (UID: \"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7\") " pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.946738 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x74mv"] Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.952331 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x74mv"] Oct 07 19:04:27 crc kubenswrapper[4825]: I1007 19:04:27.963068 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:28 crc kubenswrapper[4825]: I1007 19:04:28.393441 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5c8f64895f-s8p9d"] Oct 07 19:04:28 crc kubenswrapper[4825]: I1007 19:04:28.629876 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" event={"ID":"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7","Type":"ContainerStarted","Data":"3bd295759eb5756410b970c804f622b794bbfecc9158e0f47a35a22f403b4902"} Oct 07 19:04:29 crc kubenswrapper[4825]: I1007 19:04:29.640584 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" event={"ID":"58f2ca2f-0305-4c7c-8ebd-b2cffd2ae6a7","Type":"ContainerStarted","Data":"5a584feb8b01a099560a4422d683565999c5f791ec897ba813b4d204d97458dd"} Oct 07 19:04:29 crc kubenswrapper[4825]: I1007 19:04:29.640997 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:29 crc kubenswrapper[4825]: I1007 19:04:29.651814 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" Oct 07 19:04:29 crc kubenswrapper[4825]: I1007 19:04:29.679381 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5c8f64895f-s8p9d" podStartSLOduration=27.679349472 podStartE2EDuration="27.679349472s" podCreationTimestamp="2025-10-07 19:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:04:29.674394401 +0000 UTC m=+258.496433108" watchObservedRunningTime="2025-10-07 19:04:29.679349472 +0000 UTC m=+258.501388149" Oct 07 19:04:29 crc kubenswrapper[4825]: I1007 19:04:29.808829 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0bb39b-ac5f-48e5-87a8-80f21b338c02" path="/var/lib/kubelet/pods/7c0bb39b-ac5f-48e5-87a8-80f21b338c02/volumes" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.562790 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5shkc"] Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.563570 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5shkc" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="registry-server" containerID="cri-o://11150b0418c0c3eb143ab15458b16fe7d960c15a40224d0f143611fc36fa24bf" gracePeriod=30 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.573349 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nww4f"] Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.573553 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nww4f" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="registry-server" containerID="cri-o://8e6b9a760b362579156ba326f0e9601ca4c22772dd1973e1b91856bb327cf2d3" gracePeriod=30 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.616389 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9rmtb"] Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.616918 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerName="marketplace-operator" containerID="cri-o://58db47d59ece8247419d04606990a836c33261d0b4d7baf611bfbaa951480e80" gracePeriod=30 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.624257 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5gwml"] Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.624641 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5gwml" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="registry-server" containerID="cri-o://7b90ce285a9f323dc10ce2c110cfe75a23531b69b4de2cab6cead28af13d7e06" gracePeriod=30 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.632523 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x8pgw"] Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.632920 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x8pgw" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="registry-server" containerID="cri-o://67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c" gracePeriod=30 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.634956 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kgrmp"] Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.636148 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.637234 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kgrmp"] Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.753022 4825 generic.go:334] "Generic (PLEG): container finished" podID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerID="8e6b9a760b362579156ba326f0e9601ca4c22772dd1973e1b91856bb327cf2d3" exitCode=0 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.753096 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nww4f" event={"ID":"37844b25-13d2-4bd7-8807-35c4bc1a4dde","Type":"ContainerDied","Data":"8e6b9a760b362579156ba326f0e9601ca4c22772dd1973e1b91856bb327cf2d3"} Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.764030 4825 generic.go:334] "Generic (PLEG): container finished" podID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerID="7b90ce285a9f323dc10ce2c110cfe75a23531b69b4de2cab6cead28af13d7e06" exitCode=0 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.764159 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gwml" event={"ID":"fa9407d2-7436-4a1e-82ef-babe5b4db5e9","Type":"ContainerDied","Data":"7b90ce285a9f323dc10ce2c110cfe75a23531b69b4de2cab6cead28af13d7e06"} Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.768063 4825 generic.go:334] "Generic (PLEG): container finished" podID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerID="58db47d59ece8247419d04606990a836c33261d0b4d7baf611bfbaa951480e80" exitCode=0 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.768110 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" 
event={"ID":"3950d620-3e88-48fd-823a-f0ab8772ff5b","Type":"ContainerDied","Data":"58db47d59ece8247419d04606990a836c33261d0b4d7baf611bfbaa951480e80"} Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.774825 4825 generic.go:334] "Generic (PLEG): container finished" podID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerID="11150b0418c0c3eb143ab15458b16fe7d960c15a40224d0f143611fc36fa24bf" exitCode=0 Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.774878 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5shkc" event={"ID":"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c","Type":"ContainerDied","Data":"11150b0418c0c3eb143ab15458b16fe7d960c15a40224d0f143611fc36fa24bf"} Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.806056 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69920aad-eedb-4eca-887a-8f3225bff52b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.806145 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/69920aad-eedb-4eca-887a-8f3225bff52b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.806186 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf995\" (UniqueName: \"kubernetes.io/projected/69920aad-eedb-4eca-887a-8f3225bff52b-kube-api-access-hf995\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: 
\"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.907789 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69920aad-eedb-4eca-887a-8f3225bff52b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.907877 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/69920aad-eedb-4eca-887a-8f3225bff52b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.907916 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf995\" (UniqueName: \"kubernetes.io/projected/69920aad-eedb-4eca-887a-8f3225bff52b-kube-api-access-hf995\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.909949 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69920aad-eedb-4eca-887a-8f3225bff52b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.914800 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/69920aad-eedb-4eca-887a-8f3225bff52b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.925168 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf995\" (UniqueName: \"kubernetes.io/projected/69920aad-eedb-4eca-887a-8f3225bff52b-kube-api-access-hf995\") pod \"marketplace-operator-79b997595-kgrmp\" (UID: \"69920aad-eedb-4eca-887a-8f3225bff52b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:47 crc kubenswrapper[4825]: I1007 19:04:47.993747 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.002542 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.109960 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.111448 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vxmn\" (UniqueName: \"kubernetes.io/projected/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-kube-api-access-5vxmn\") pod \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.111497 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-utilities\") pod \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.111572 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-catalog-content\") pod \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\" (UID: \"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.112591 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-utilities" (OuterVolumeSpecName: "utilities") pod "43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" (UID: "43171f5c-ea7f-43d8-bdec-0d8f5b5c907c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.113025 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.114311 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-kube-api-access-5vxmn" (OuterVolumeSpecName: "kube-api-access-5vxmn") pod "43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" (UID: "43171f5c-ea7f-43d8-bdec-0d8f5b5c907c"). InnerVolumeSpecName "kube-api-access-5vxmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.128134 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.129058 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.214558 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-catalog-content\") pod \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.214645 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwttq\" (UniqueName: \"kubernetes.io/projected/37844b25-13d2-4bd7-8807-35c4bc1a4dde-kube-api-access-kwttq\") pod \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\" (UID: \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.214713 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-utilities\") pod \"37844b25-13d2-4bd7-8807-35c4bc1a4dde\" (UID: 
\"37844b25-13d2-4bd7-8807-35c4bc1a4dde\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.215571 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-utilities" (OuterVolumeSpecName: "utilities") pod "37844b25-13d2-4bd7-8807-35c4bc1a4dde" (UID: "37844b25-13d2-4bd7-8807-35c4bc1a4dde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.216809 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.216834 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vxmn\" (UniqueName: \"kubernetes.io/projected/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-kube-api-access-5vxmn\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.216848 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.218954 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37844b25-13d2-4bd7-8807-35c4bc1a4dde-kube-api-access-kwttq" (OuterVolumeSpecName: "kube-api-access-kwttq") pod "37844b25-13d2-4bd7-8807-35c4bc1a4dde" (UID: "37844b25-13d2-4bd7-8807-35c4bc1a4dde"). InnerVolumeSpecName "kube-api-access-kwttq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.238917 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" (UID: "43171f5c-ea7f-43d8-bdec-0d8f5b5c907c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.268781 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37844b25-13d2-4bd7-8807-35c4bc1a4dde" (UID: "37844b25-13d2-4bd7-8807-35c4bc1a4dde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.302356 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kgrmp"] Oct 07 19:04:48 crc kubenswrapper[4825]: W1007 19:04:48.309944 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69920aad_eedb_4eca_887a_8f3225bff52b.slice/crio-8e1c87ce5181f27fc33dd5c001b1a756d18e61b98795d512c368938b9ed6f9fd WatchSource:0}: Error finding container 8e1c87ce5181f27fc33dd5c001b1a756d18e61b98795d512c368938b9ed6f9fd: Status 404 returned error can't find the container with id 8e1c87ce5181f27fc33dd5c001b1a756d18e61b98795d512c368938b9ed6f9fd Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.317985 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2xhp\" (UniqueName: \"kubernetes.io/projected/3950d620-3e88-48fd-823a-f0ab8772ff5b-kube-api-access-p2xhp\") pod \"3950d620-3e88-48fd-823a-f0ab8772ff5b\" (UID: 
\"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318068 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-trusted-ca\") pod \"3950d620-3e88-48fd-823a-f0ab8772ff5b\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318130 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-operator-metrics\") pod \"3950d620-3e88-48fd-823a-f0ab8772ff5b\" (UID: \"3950d620-3e88-48fd-823a-f0ab8772ff5b\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318160 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-catalog-content\") pod \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318197 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x4gm\" (UniqueName: \"kubernetes.io/projected/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-kube-api-access-2x4gm\") pod \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318243 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-utilities\") pod \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\" (UID: \"d980b20c-41bd-4aff-9a22-2e806ce8d5cf\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318268 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-7vsrp\" (UniqueName: \"kubernetes.io/projected/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-kube-api-access-7vsrp\") pod \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318300 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-catalog-content\") pod \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318341 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-utilities\") pod \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\" (UID: \"fa9407d2-7436-4a1e-82ef-babe5b4db5e9\") " Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318596 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37844b25-13d2-4bd7-8807-35c4bc1a4dde-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318654 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwttq\" (UniqueName: \"kubernetes.io/projected/37844b25-13d2-4bd7-8807-35c4bc1a4dde-kube-api-access-kwttq\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.318669 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.319653 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-trusted-ca" (OuterVolumeSpecName: 
"marketplace-trusted-ca") pod "3950d620-3e88-48fd-823a-f0ab8772ff5b" (UID: "3950d620-3e88-48fd-823a-f0ab8772ff5b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.319714 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-utilities" (OuterVolumeSpecName: "utilities") pod "fa9407d2-7436-4a1e-82ef-babe5b4db5e9" (UID: "fa9407d2-7436-4a1e-82ef-babe5b4db5e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.319738 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-utilities" (OuterVolumeSpecName: "utilities") pod "d980b20c-41bd-4aff-9a22-2e806ce8d5cf" (UID: "d980b20c-41bd-4aff-9a22-2e806ce8d5cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.321717 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3950d620-3e88-48fd-823a-f0ab8772ff5b-kube-api-access-p2xhp" (OuterVolumeSpecName: "kube-api-access-p2xhp") pod "3950d620-3e88-48fd-823a-f0ab8772ff5b" (UID: "3950d620-3e88-48fd-823a-f0ab8772ff5b"). InnerVolumeSpecName "kube-api-access-p2xhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.322600 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-kube-api-access-7vsrp" (OuterVolumeSpecName: "kube-api-access-7vsrp") pod "fa9407d2-7436-4a1e-82ef-babe5b4db5e9" (UID: "fa9407d2-7436-4a1e-82ef-babe5b4db5e9"). InnerVolumeSpecName "kube-api-access-7vsrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.323490 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-kube-api-access-2x4gm" (OuterVolumeSpecName: "kube-api-access-2x4gm") pod "d980b20c-41bd-4aff-9a22-2e806ce8d5cf" (UID: "d980b20c-41bd-4aff-9a22-2e806ce8d5cf"). InnerVolumeSpecName "kube-api-access-2x4gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.326910 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3950d620-3e88-48fd-823a-f0ab8772ff5b" (UID: "3950d620-3e88-48fd-823a-f0ab8772ff5b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.336095 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa9407d2-7436-4a1e-82ef-babe5b4db5e9" (UID: "fa9407d2-7436-4a1e-82ef-babe5b4db5e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419692 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2xhp\" (UniqueName: \"kubernetes.io/projected/3950d620-3e88-48fd-823a-f0ab8772ff5b-kube-api-access-p2xhp\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419732 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419744 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3950d620-3e88-48fd-823a-f0ab8772ff5b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419758 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x4gm\" (UniqueName: \"kubernetes.io/projected/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-kube-api-access-2x4gm\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419771 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419784 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vsrp\" (UniqueName: \"kubernetes.io/projected/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-kube-api-access-7vsrp\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419795 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-catalog-content\") on node \"crc\" 
DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.419833 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9407d2-7436-4a1e-82ef-babe5b4db5e9-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.420404 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d980b20c-41bd-4aff-9a22-2e806ce8d5cf" (UID: "d980b20c-41bd-4aff-9a22-2e806ce8d5cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.521846 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b20c-41bd-4aff-9a22-2e806ce8d5cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.781302 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" event={"ID":"69920aad-eedb-4eca-887a-8f3225bff52b","Type":"ContainerStarted","Data":"70f153118c9370356d5e9c8f7bcabea1554cc9bdcb6ed817911dfe4ac6325a8b"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.781349 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" event={"ID":"69920aad-eedb-4eca-887a-8f3225bff52b","Type":"ContainerStarted","Data":"8e1c87ce5181f27fc33dd5c001b1a756d18e61b98795d512c368938b9ed6f9fd"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.783547 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.785767 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-5gwml" event={"ID":"fa9407d2-7436-4a1e-82ef-babe5b4db5e9","Type":"ContainerDied","Data":"6592aaa1b84a14913888ae75e25064de5b9fc43701ea1c96f99b8008f87dc92a"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.785817 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.785830 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gwml" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.785835 4825 scope.go:117] "RemoveContainer" containerID="7b90ce285a9f323dc10ce2c110cfe75a23531b69b4de2cab6cead28af13d7e06" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.787345 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.791296 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9rmtb" event={"ID":"3950d620-3e88-48fd-823a-f0ab8772ff5b","Type":"ContainerDied","Data":"06aee62d4b265805a18886720fae2dcfa719d50ec4ee1f62f80e889f8d6bd5fb"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.794764 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5shkc" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.794809 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5shkc" event={"ID":"43171f5c-ea7f-43d8-bdec-0d8f5b5c907c","Type":"ContainerDied","Data":"a019c6e70fccd0176f9118bb54f5fd852eb2c2a03bdaa7939f40f44d9e3bbbec"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.796430 4825 generic.go:334] "Generic (PLEG): container finished" podID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerID="67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c" exitCode=0 Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.796487 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8pgw" event={"ID":"d980b20c-41bd-4aff-9a22-2e806ce8d5cf","Type":"ContainerDied","Data":"67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.796505 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8pgw" event={"ID":"d980b20c-41bd-4aff-9a22-2e806ce8d5cf","Type":"ContainerDied","Data":"b6dee80d25b6cbaf4b6435f1618b1a1c1820f72ae85cd3caab89a4c3aa8f59f9"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.796558 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x8pgw" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.801981 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kgrmp" podStartSLOduration=1.801961581 podStartE2EDuration="1.801961581s" podCreationTimestamp="2025-10-07 19:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:04:48.801677382 +0000 UTC m=+277.623716019" watchObservedRunningTime="2025-10-07 19:04:48.801961581 +0000 UTC m=+277.624000218" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.818439 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nww4f" event={"ID":"37844b25-13d2-4bd7-8807-35c4bc1a4dde","Type":"ContainerDied","Data":"9956d237bd1f499c5d783b37416ae05ddb4ace77fb6bcceda4cb7d28ef21b9a5"} Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.818539 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nww4f" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.819882 4825 scope.go:117] "RemoveContainer" containerID="f021f25ab41bdcd6000599d9d268545df67acbc3ed9ad87989426f560eb2e1a4" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.846641 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gwml"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.849863 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gwml"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.864927 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5shkc"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.873197 4825 scope.go:117] "RemoveContainer" containerID="105f5dd179dee36d5c929d2e113fc45bde45d16b7c9d39f94665e86530e7e71a" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.873363 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5shkc"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.876840 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9rmtb"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.879339 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9rmtb"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.885692 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x8pgw"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.888091 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x8pgw"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.896706 4825 scope.go:117] "RemoveContainer" 
containerID="58db47d59ece8247419d04606990a836c33261d0b4d7baf611bfbaa951480e80" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.904707 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nww4f"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.912055 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nww4f"] Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.917355 4825 scope.go:117] "RemoveContainer" containerID="11150b0418c0c3eb143ab15458b16fe7d960c15a40224d0f143611fc36fa24bf" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.938326 4825 scope.go:117] "RemoveContainer" containerID="ae8ddc3bf3aa7da78c3bfe329936d3903ece842324e51b7b2acfb7a75a1fd031" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.954544 4825 scope.go:117] "RemoveContainer" containerID="62fd5e093fc1ce72e2e7adfc6ae46f46e0561f73dce2281386fa61dfcc29eb49" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.971379 4825 scope.go:117] "RemoveContainer" containerID="67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c" Oct 07 19:04:48 crc kubenswrapper[4825]: I1007 19:04:48.987313 4825 scope.go:117] "RemoveContainer" containerID="c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.002003 4825 scope.go:117] "RemoveContainer" containerID="3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.014140 4825 scope.go:117] "RemoveContainer" containerID="67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c" Oct 07 19:04:49 crc kubenswrapper[4825]: E1007 19:04:49.014904 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c\": container with ID starting with 
67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c not found: ID does not exist" containerID="67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.014964 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c"} err="failed to get container status \"67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c\": rpc error: code = NotFound desc = could not find container \"67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c\": container with ID starting with 67a33d2ca611a516fca2429cec53b837faa2b4da3ff0a88fe332be7bcb10462c not found: ID does not exist" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.015003 4825 scope.go:117] "RemoveContainer" containerID="c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2" Oct 07 19:04:49 crc kubenswrapper[4825]: E1007 19:04:49.015448 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2\": container with ID starting with c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2 not found: ID does not exist" containerID="c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.015493 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2"} err="failed to get container status \"c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2\": rpc error: code = NotFound desc = could not find container \"c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2\": container with ID starting with c8f2442a4f10a9fbb39a308dab018298a209283d40c1d50034232a9d91fa19f2 not found: ID does not 
exist" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.015523 4825 scope.go:117] "RemoveContainer" containerID="3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97" Oct 07 19:04:49 crc kubenswrapper[4825]: E1007 19:04:49.015924 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97\": container with ID starting with 3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97 not found: ID does not exist" containerID="3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.015987 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97"} err="failed to get container status \"3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97\": rpc error: code = NotFound desc = could not find container \"3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97\": container with ID starting with 3eff291320f447cca77bb31b8dce3a7922a54e4936e3feb80594e1fd9ec70f97 not found: ID does not exist" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.016009 4825 scope.go:117] "RemoveContainer" containerID="8e6b9a760b362579156ba326f0e9601ca4c22772dd1973e1b91856bb327cf2d3" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.027749 4825 scope.go:117] "RemoveContainer" containerID="066e924a191f04a389e8db4d6617ba5de14dd7116153957f5b28913e6dcc29c2" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.042251 4825 scope.go:117] "RemoveContainer" containerID="00bc5e686ee488f1c5f671f83bb1a68a97ce2799650ed4baf9ea3a79bccde179" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.801924 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" 
path="/var/lib/kubelet/pods/37844b25-13d2-4bd7-8807-35c4bc1a4dde/volumes" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.802950 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" path="/var/lib/kubelet/pods/3950d620-3e88-48fd-823a-f0ab8772ff5b/volumes" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.803603 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" path="/var/lib/kubelet/pods/43171f5c-ea7f-43d8-bdec-0d8f5b5c907c/volumes" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.804968 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" path="/var/lib/kubelet/pods/d980b20c-41bd-4aff-9a22-2e806ce8d5cf/volumes" Oct 07 19:04:49 crc kubenswrapper[4825]: I1007 19:04:49.805730 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" path="/var/lib/kubelet/pods/fa9407d2-7436-4a1e-82ef-babe5b4db5e9/volumes" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.575577 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jzchl"] Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.576040 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerName="marketplace-operator" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.576146 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerName="marketplace-operator" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.576248 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.576364 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.576460 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.576559 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.576772 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.576838 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.576895 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.576946 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.577008 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.577066 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.577128 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.577193 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.577274 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.579289 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.579380 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.579461 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.579621 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.579687 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.579744 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.579799 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.579855 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.579914 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="extract-utilities" Oct 07 19:04:50 crc kubenswrapper[4825]: E1007 19:04:50.579977 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.580042 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="extract-content" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.580218 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3950d620-3e88-48fd-823a-f0ab8772ff5b" containerName="marketplace-operator" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.580327 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="43171f5c-ea7f-43d8-bdec-0d8f5b5c907c" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.580394 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9407d2-7436-4a1e-82ef-babe5b4db5e9" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.580456 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d980b20c-41bd-4aff-9a22-2e806ce8d5cf" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.580514 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="37844b25-13d2-4bd7-8807-35c4bc1a4dde" containerName="registry-server" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.581287 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.583770 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.590913 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzchl"] Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.769911 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9p5\" (UniqueName: \"kubernetes.io/projected/1a126e8e-4603-49da-a888-c12dba592af6-kube-api-access-7n9p5\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.769957 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a126e8e-4603-49da-a888-c12dba592af6-utilities\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.770023 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a126e8e-4603-49da-a888-c12dba592af6-catalog-content\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.871607 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a126e8e-4603-49da-a888-c12dba592af6-catalog-content\") pod \"redhat-marketplace-jzchl\" (UID: 
\"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.871851 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n9p5\" (UniqueName: \"kubernetes.io/projected/1a126e8e-4603-49da-a888-c12dba592af6-kube-api-access-7n9p5\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.871897 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a126e8e-4603-49da-a888-c12dba592af6-utilities\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.872577 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a126e8e-4603-49da-a888-c12dba592af6-catalog-content\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.872800 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a126e8e-4603-49da-a888-c12dba592af6-utilities\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.895950 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n9p5\" (UniqueName: \"kubernetes.io/projected/1a126e8e-4603-49da-a888-c12dba592af6-kube-api-access-7n9p5\") pod \"redhat-marketplace-jzchl\" (UID: \"1a126e8e-4603-49da-a888-c12dba592af6\") " 
pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:50 crc kubenswrapper[4825]: I1007 19:04:50.896433 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.116991 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzchl"] Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.185014 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xq5h6"] Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.186925 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.190346 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.198782 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xq5h6"] Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.281393 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbac88b1-e1d0-432b-a57d-b73910086aa8-catalog-content\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.281451 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbac88b1-e1d0-432b-a57d-b73910086aa8-utilities\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 
19:04:51.281691 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q87zs\" (UniqueName: \"kubernetes.io/projected/cbac88b1-e1d0-432b-a57d-b73910086aa8-kube-api-access-q87zs\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.382575 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q87zs\" (UniqueName: \"kubernetes.io/projected/cbac88b1-e1d0-432b-a57d-b73910086aa8-kube-api-access-q87zs\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.382639 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbac88b1-e1d0-432b-a57d-b73910086aa8-catalog-content\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.382664 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbac88b1-e1d0-432b-a57d-b73910086aa8-utilities\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.383714 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbac88b1-e1d0-432b-a57d-b73910086aa8-utilities\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.383951 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbac88b1-e1d0-432b-a57d-b73910086aa8-catalog-content\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.405122 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q87zs\" (UniqueName: \"kubernetes.io/projected/cbac88b1-e1d0-432b-a57d-b73910086aa8-kube-api-access-q87zs\") pod \"redhat-operators-xq5h6\" (UID: \"cbac88b1-e1d0-432b-a57d-b73910086aa8\") " pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.520856 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.723241 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xq5h6"] Oct 07 19:04:51 crc kubenswrapper[4825]: W1007 19:04:51.732576 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbac88b1_e1d0_432b_a57d_b73910086aa8.slice/crio-261816cb7b4eef5fd1d8232ab4f01d724e340b0bd29dd2b3db3c8415bbc6b40b WatchSource:0}: Error finding container 261816cb7b4eef5fd1d8232ab4f01d724e340b0bd29dd2b3db3c8415bbc6b40b: Status 404 returned error can't find the container with id 261816cb7b4eef5fd1d8232ab4f01d724e340b0bd29dd2b3db3c8415bbc6b40b Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.848144 4825 generic.go:334] "Generic (PLEG): container finished" podID="1a126e8e-4603-49da-a888-c12dba592af6" containerID="b6ca6df44dfa5f28e6d99693c50755e4f7dbcdb7668636119030e4534ead0bc6" exitCode=0 Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.848203 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-jzchl" event={"ID":"1a126e8e-4603-49da-a888-c12dba592af6","Type":"ContainerDied","Data":"b6ca6df44dfa5f28e6d99693c50755e4f7dbcdb7668636119030e4534ead0bc6"} Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.848278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzchl" event={"ID":"1a126e8e-4603-49da-a888-c12dba592af6","Type":"ContainerStarted","Data":"4cd140fc2bd925f1c4c078b9cc3500c80ea31c26af69c70f6f296f58b602c257"} Oct 07 19:04:51 crc kubenswrapper[4825]: I1007 19:04:51.850518 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq5h6" event={"ID":"cbac88b1-e1d0-432b-a57d-b73910086aa8","Type":"ContainerStarted","Data":"261816cb7b4eef5fd1d8232ab4f01d724e340b0bd29dd2b3db3c8415bbc6b40b"} Oct 07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.857219 4825 generic.go:334] "Generic (PLEG): container finished" podID="1a126e8e-4603-49da-a888-c12dba592af6" containerID="2633b271db8289b36125c91974c16bc0181007bc9228f50361efb2ed14a78e4f" exitCode=0 Oct 07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.857340 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzchl" event={"ID":"1a126e8e-4603-49da-a888-c12dba592af6","Type":"ContainerDied","Data":"2633b271db8289b36125c91974c16bc0181007bc9228f50361efb2ed14a78e4f"} Oct 07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.860124 4825 generic.go:334] "Generic (PLEG): container finished" podID="cbac88b1-e1d0-432b-a57d-b73910086aa8" containerID="6b670230c24af21246ca49fa624e313f781a8f602e0b5fea65312339fe6bf2f9" exitCode=0 Oct 07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.860263 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq5h6" event={"ID":"cbac88b1-e1d0-432b-a57d-b73910086aa8","Type":"ContainerDied","Data":"6b670230c24af21246ca49fa624e313f781a8f602e0b5fea65312339fe6bf2f9"} Oct 
07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.977160 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8bx2n"] Oct 07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.979832 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.982350 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 19:04:52 crc kubenswrapper[4825]: I1007 19:04:52.982587 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bx2n"] Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.100798 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-catalog-content\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.100902 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-utilities\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.101507 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxtp8\" (UniqueName: \"kubernetes.io/projected/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-kube-api-access-lxtp8\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc 
kubenswrapper[4825]: I1007 19:04:53.202739 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-catalog-content\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.202826 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-utilities\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.202880 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxtp8\" (UniqueName: \"kubernetes.io/projected/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-kube-api-access-lxtp8\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.204201 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-catalog-content\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.204713 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-utilities\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.221570 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxtp8\" (UniqueName: \"kubernetes.io/projected/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-kube-api-access-lxtp8\") pod \"community-operators-8bx2n\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.300686 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.479065 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bx2n"] Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.582178 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mmrrt"] Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.583514 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.585847 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.598383 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmrrt"] Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.608127 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d9fecf-f52d-4758-9a79-9c80afa25e80-utilities\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.608287 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xzwhm\" (UniqueName: \"kubernetes.io/projected/d4d9fecf-f52d-4758-9a79-9c80afa25e80-kube-api-access-xzwhm\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.608390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d9fecf-f52d-4758-9a79-9c80afa25e80-catalog-content\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.709090 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzwhm\" (UniqueName: \"kubernetes.io/projected/d4d9fecf-f52d-4758-9a79-9c80afa25e80-kube-api-access-xzwhm\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.709177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d9fecf-f52d-4758-9a79-9c80afa25e80-catalog-content\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.709216 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d9fecf-f52d-4758-9a79-9c80afa25e80-utilities\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.709669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d9fecf-f52d-4758-9a79-9c80afa25e80-catalog-content\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.709692 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d9fecf-f52d-4758-9a79-9c80afa25e80-utilities\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.729669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzwhm\" (UniqueName: \"kubernetes.io/projected/d4d9fecf-f52d-4758-9a79-9c80afa25e80-kube-api-access-xzwhm\") pod \"certified-operators-mmrrt\" (UID: \"d4d9fecf-f52d-4758-9a79-9c80afa25e80\") " pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.871070 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzchl" event={"ID":"1a126e8e-4603-49da-a888-c12dba592af6","Type":"ContainerStarted","Data":"1cdaa2e0970f7bf19fcf639ce404d3ef5360ca5687d7c815c336a76f55a1993b"} Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.872788 4825 generic.go:334] "Generic (PLEG): container finished" podID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerID="43376bc96c34a2b3cb3a36c0b395e8c4556d498089cc15df2bf945924b395099" exitCode=0 Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.872823 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bx2n" event={"ID":"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4","Type":"ContainerDied","Data":"43376bc96c34a2b3cb3a36c0b395e8c4556d498089cc15df2bf945924b395099"} Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 
19:04:53.872841 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bx2n" event={"ID":"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4","Type":"ContainerStarted","Data":"b79ab1772e44b7da086161d1012a7d7e23f81ef854a456e72425390248f78bae"} Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.906829 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:04:53 crc kubenswrapper[4825]: I1007 19:04:53.911310 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jzchl" podStartSLOduration=2.088780275 podStartE2EDuration="3.9112863s" podCreationTimestamp="2025-10-07 19:04:50 +0000 UTC" firstStartedPulling="2025-10-07 19:04:51.85059678 +0000 UTC m=+280.672635417" lastFinishedPulling="2025-10-07 19:04:53.673102805 +0000 UTC m=+282.495141442" observedRunningTime="2025-10-07 19:04:53.886948127 +0000 UTC m=+282.708986804" watchObservedRunningTime="2025-10-07 19:04:53.9112863 +0000 UTC m=+282.733324937" Oct 07 19:04:54 crc kubenswrapper[4825]: I1007 19:04:54.376298 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmrrt"] Oct 07 19:04:54 crc kubenswrapper[4825]: I1007 19:04:54.881974 4825 generic.go:334] "Generic (PLEG): container finished" podID="cbac88b1-e1d0-432b-a57d-b73910086aa8" containerID="f633d41a91275cf16cf068bd24fe658d9f534d75a5df0216d21c108f044dddc1" exitCode=0 Oct 07 19:04:54 crc kubenswrapper[4825]: I1007 19:04:54.882059 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq5h6" event={"ID":"cbac88b1-e1d0-432b-a57d-b73910086aa8","Type":"ContainerDied","Data":"f633d41a91275cf16cf068bd24fe658d9f534d75a5df0216d21c108f044dddc1"} Oct 07 19:04:54 crc kubenswrapper[4825]: I1007 19:04:54.885188 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mmrrt" event={"ID":"d4d9fecf-f52d-4758-9a79-9c80afa25e80","Type":"ContainerStarted","Data":"a239417df73bef6684cabc085844d04238ea8da3bcd844590503990c02a562d5"} Oct 07 19:04:55 crc kubenswrapper[4825]: I1007 19:04:55.892724 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq5h6" event={"ID":"cbac88b1-e1d0-432b-a57d-b73910086aa8","Type":"ContainerStarted","Data":"d888f502b51552f0da582e82adc5cf2d93d0c642b4fe0fc62dee6b23487dcd08"} Oct 07 19:04:55 crc kubenswrapper[4825]: I1007 19:04:55.895055 4825 generic.go:334] "Generic (PLEG): container finished" podID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerID="79a4d81830edf082b54c7b16af98bb9019505ff001ca119d05e529d107cbd539" exitCode=0 Oct 07 19:04:55 crc kubenswrapper[4825]: I1007 19:04:55.895140 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bx2n" event={"ID":"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4","Type":"ContainerDied","Data":"79a4d81830edf082b54c7b16af98bb9019505ff001ca119d05e529d107cbd539"} Oct 07 19:04:55 crc kubenswrapper[4825]: I1007 19:04:55.897200 4825 generic.go:334] "Generic (PLEG): container finished" podID="d4d9fecf-f52d-4758-9a79-9c80afa25e80" containerID="c9f2132ae95e4498aae286d9d2972cf18f33f2848fe44945befe670e9f0ff553" exitCode=0 Oct 07 19:04:55 crc kubenswrapper[4825]: I1007 19:04:55.897272 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmrrt" event={"ID":"d4d9fecf-f52d-4758-9a79-9c80afa25e80","Type":"ContainerDied","Data":"c9f2132ae95e4498aae286d9d2972cf18f33f2848fe44945befe670e9f0ff553"} Oct 07 19:04:55 crc kubenswrapper[4825]: I1007 19:04:55.940102 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xq5h6" podStartSLOduration=2.187767448 podStartE2EDuration="4.940086772s" podCreationTimestamp="2025-10-07 19:04:51 +0000 UTC" 
firstStartedPulling="2025-10-07 19:04:52.86181239 +0000 UTC m=+281.683851027" lastFinishedPulling="2025-10-07 19:04:55.614131694 +0000 UTC m=+284.436170351" observedRunningTime="2025-10-07 19:04:55.915113999 +0000 UTC m=+284.737152676" watchObservedRunningTime="2025-10-07 19:04:55.940086772 +0000 UTC m=+284.762125409" Oct 07 19:04:56 crc kubenswrapper[4825]: I1007 19:04:56.908107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bx2n" event={"ID":"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4","Type":"ContainerStarted","Data":"5045471ec2dc1f7dc9915b4039d295d62f11fa66581d3331c12001a6a80c8936"} Oct 07 19:04:56 crc kubenswrapper[4825]: I1007 19:04:56.928557 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8bx2n" podStartSLOduration=2.413295072 podStartE2EDuration="4.928537649s" podCreationTimestamp="2025-10-07 19:04:52 +0000 UTC" firstStartedPulling="2025-10-07 19:04:53.874576728 +0000 UTC m=+282.696615395" lastFinishedPulling="2025-10-07 19:04:56.389819335 +0000 UTC m=+285.211857972" observedRunningTime="2025-10-07 19:04:56.927473525 +0000 UTC m=+285.749512162" watchObservedRunningTime="2025-10-07 19:04:56.928537649 +0000 UTC m=+285.750576276" Oct 07 19:04:57 crc kubenswrapper[4825]: I1007 19:04:57.914708 4825 generic.go:334] "Generic (PLEG): container finished" podID="d4d9fecf-f52d-4758-9a79-9c80afa25e80" containerID="e51ad6ab22d91e95caca785843d1f997a773bb9a64bb512c721b7f74cf126112" exitCode=0 Oct 07 19:04:57 crc kubenswrapper[4825]: I1007 19:04:57.914754 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmrrt" event={"ID":"d4d9fecf-f52d-4758-9a79-9c80afa25e80","Type":"ContainerDied","Data":"e51ad6ab22d91e95caca785843d1f997a773bb9a64bb512c721b7f74cf126112"} Oct 07 19:04:58 crc kubenswrapper[4825]: I1007 19:04:58.922576 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mmrrt" event={"ID":"d4d9fecf-f52d-4758-9a79-9c80afa25e80","Type":"ContainerStarted","Data":"04982bf0a7228ee898c9221266e29d3a34ba360bd7c1f770576674048666d09b"} Oct 07 19:04:58 crc kubenswrapper[4825]: I1007 19:04:58.939556 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mmrrt" podStartSLOduration=3.451639712 podStartE2EDuration="5.939525679s" podCreationTimestamp="2025-10-07 19:04:53 +0000 UTC" firstStartedPulling="2025-10-07 19:04:55.898868926 +0000 UTC m=+284.720907563" lastFinishedPulling="2025-10-07 19:04:58.386754893 +0000 UTC m=+287.208793530" observedRunningTime="2025-10-07 19:04:58.938500686 +0000 UTC m=+287.760539363" watchObservedRunningTime="2025-10-07 19:04:58.939525679 +0000 UTC m=+287.761564316" Oct 07 19:05:00 crc kubenswrapper[4825]: I1007 19:05:00.896934 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:05:00 crc kubenswrapper[4825]: I1007 19:05:00.897353 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:05:00 crc kubenswrapper[4825]: I1007 19:05:00.962896 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:05:01 crc kubenswrapper[4825]: I1007 19:05:01.003835 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jzchl" Oct 07 19:05:01 crc kubenswrapper[4825]: I1007 19:05:01.521110 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:05:01 crc kubenswrapper[4825]: I1007 19:05:01.521202 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:05:01 crc 
kubenswrapper[4825]: I1007 19:05:01.579945 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:05:02 crc kubenswrapper[4825]: I1007 19:05:02.017029 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xq5h6" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.301678 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.301938 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.367631 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.907071 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.907123 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.940801 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.995639 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:05:03 crc kubenswrapper[4825]: I1007 19:05:03.999547 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mmrrt" Oct 07 19:06:05 crc kubenswrapper[4825]: I1007 19:06:05.708705 4825 patch_prober.go:28] interesting 
pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:06:05 crc kubenswrapper[4825]: I1007 19:06:05.709521 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:06:35 crc kubenswrapper[4825]: I1007 19:06:35.708644 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:06:35 crc kubenswrapper[4825]: I1007 19:06:35.709263 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:07:05 crc kubenswrapper[4825]: I1007 19:07:05.709566 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:07:05 crc kubenswrapper[4825]: I1007 19:07:05.711402 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:07:05 crc kubenswrapper[4825]: I1007 19:07:05.711507 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:07:05 crc kubenswrapper[4825]: I1007 19:07:05.712501 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d59266bf242c50a2596b3ab7b505a4aa50801a6525e38f53609ceb79dca8838b"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:07:05 crc kubenswrapper[4825]: I1007 19:07:05.712625 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://d59266bf242c50a2596b3ab7b505a4aa50801a6525e38f53609ceb79dca8838b" gracePeriod=600 Oct 07 19:07:06 crc kubenswrapper[4825]: I1007 19:07:06.802900 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="d59266bf242c50a2596b3ab7b505a4aa50801a6525e38f53609ceb79dca8838b" exitCode=0 Oct 07 19:07:06 crc kubenswrapper[4825]: I1007 19:07:06.802985 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"d59266bf242c50a2596b3ab7b505a4aa50801a6525e38f53609ceb79dca8838b"} Oct 07 19:07:06 crc kubenswrapper[4825]: I1007 19:07:06.803531 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" 
event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"363a8a3b4b4e09ebede35ca07198927c54a01eb1008f7fc708faf1a573e0f6cc"} Oct 07 19:07:06 crc kubenswrapper[4825]: I1007 19:07:06.803551 4825 scope.go:117] "RemoveContainer" containerID="e76fd45df9f9ed4e41be848b53f6058abd0331e0064031948dbbc070ab7ed954" Oct 07 19:07:19 crc kubenswrapper[4825]: I1007 19:07:19.993599 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tdrhl"] Oct 07 19:07:19 crc kubenswrapper[4825]: I1007 19:07:19.995178 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.057100 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tdrhl"] Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.172075 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2dcd5e14-2097-4453-b429-7aa28be57284-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.172128 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-bound-sa-token\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.172150 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2dcd5e14-2097-4453-b429-7aa28be57284-trusted-ca\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.172174 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2dcd5e14-2097-4453-b429-7aa28be57284-registry-certificates\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.172349 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-registry-tls\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.172426 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9cps\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-kube-api-access-n9cps\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.172503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2dcd5e14-2097-4453-b429-7aa28be57284-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc 
kubenswrapper[4825]: I1007 19:07:20.172540 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.203346 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.273613 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2dcd5e14-2097-4453-b429-7aa28be57284-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.273700 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2dcd5e14-2097-4453-b429-7aa28be57284-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.273728 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-bound-sa-token\") pod 
\"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.273751 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dcd5e14-2097-4453-b429-7aa28be57284-trusted-ca\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.273781 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2dcd5e14-2097-4453-b429-7aa28be57284-registry-certificates\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.273808 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-registry-tls\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.273845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9cps\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-kube-api-access-n9cps\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.274268 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/2dcd5e14-2097-4453-b429-7aa28be57284-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.275608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2dcd5e14-2097-4453-b429-7aa28be57284-registry-certificates\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.276755 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dcd5e14-2097-4453-b429-7aa28be57284-trusted-ca\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.285314 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2dcd5e14-2097-4453-b429-7aa28be57284-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.285313 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-registry-tls\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.302063 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-n9cps\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-kube-api-access-n9cps\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.304068 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dcd5e14-2097-4453-b429-7aa28be57284-bound-sa-token\") pod \"image-registry-66df7c8f76-tdrhl\" (UID: \"2dcd5e14-2097-4453-b429-7aa28be57284\") " pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.313972 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.556092 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tdrhl"] Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.902267 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" event={"ID":"2dcd5e14-2097-4453-b429-7aa28be57284","Type":"ContainerStarted","Data":"ebfdd7fcbc2d6d4f1d304006e2233ca8477f9dfb22120ac6260d6710746f4345"} Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.902330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" event={"ID":"2dcd5e14-2097-4453-b429-7aa28be57284","Type":"ContainerStarted","Data":"a46aac3b355583612a307d741a94f63f2fe3810f71e18a0c93617afab12437f9"} Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.902575 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:20 crc kubenswrapper[4825]: I1007 19:07:20.926139 
4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" podStartSLOduration=1.9261132600000002 podStartE2EDuration="1.92611326s" podCreationTimestamp="2025-10-07 19:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:07:20.923949401 +0000 UTC m=+429.745988088" watchObservedRunningTime="2025-10-07 19:07:20.92611326 +0000 UTC m=+429.748151927" Oct 07 19:07:40 crc kubenswrapper[4825]: I1007 19:07:40.323116 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tdrhl" Oct 07 19:07:40 crc kubenswrapper[4825]: I1007 19:07:40.382906 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r2xb4"] Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.435069 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" podUID="a4f51b57-041d-4009-9db3-3579fa7bb84c" containerName="registry" containerID="cri-o://01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2" gracePeriod=30 Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.824926 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.956706 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-tls\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.956814 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-certificates\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.956913 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-bound-sa-token\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.957133 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4f51b57-041d-4009-9db3-3579fa7bb84c-installation-pull-secrets\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.957200 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4f51b57-041d-4009-9db3-3579fa7bb84c-ca-trust-extracted\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.957521 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.957627 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpbxh\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-kube-api-access-tpbxh\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.957836 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-trusted-ca\") pod \"a4f51b57-041d-4009-9db3-3579fa7bb84c\" (UID: \"a4f51b57-041d-4009-9db3-3579fa7bb84c\") " Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.958135 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.958558 4825 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.959072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.966617 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.967135 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-kube-api-access-tpbxh" (OuterVolumeSpecName: "kube-api-access-tpbxh") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "kube-api-access-tpbxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.967389 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f51b57-041d-4009-9db3-3579fa7bb84c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.968875 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.976491 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 19:08:05 crc kubenswrapper[4825]: I1007 19:08:05.988602 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f51b57-041d-4009-9db3-3579fa7bb84c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a4f51b57-041d-4009-9db3-3579fa7bb84c" (UID: "a4f51b57-041d-4009-9db3-3579fa7bb84c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.059644 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.059695 4825 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4f51b57-041d-4009-9db3-3579fa7bb84c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.059716 4825 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4f51b57-041d-4009-9db3-3579fa7bb84c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.059739 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpbxh\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-kube-api-access-tpbxh\") on node \"crc\" DevicePath \"\"" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.059759 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4f51b57-041d-4009-9db3-3579fa7bb84c-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.059777 4825 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4f51b57-041d-4009-9db3-3579fa7bb84c-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.210359 4825 generic.go:334] "Generic (PLEG): container finished" podID="a4f51b57-041d-4009-9db3-3579fa7bb84c" containerID="01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2" exitCode=0 Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 
19:08:06.210445 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" event={"ID":"a4f51b57-041d-4009-9db3-3579fa7bb84c","Type":"ContainerDied","Data":"01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2"} Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.210493 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" event={"ID":"a4f51b57-041d-4009-9db3-3579fa7bb84c","Type":"ContainerDied","Data":"e524be5ae2150708921b8671de2f5b9d4fc76f23ef54369db6830d7ef2628178"} Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.210530 4825 scope.go:117] "RemoveContainer" containerID="01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.210949 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r2xb4" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.235701 4825 scope.go:117] "RemoveContainer" containerID="01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2" Oct 07 19:08:06 crc kubenswrapper[4825]: E1007 19:08:06.236420 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2\": container with ID starting with 01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2 not found: ID does not exist" containerID="01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.236487 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2"} err="failed to get container status \"01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2\": rpc error: code = 
NotFound desc = could not find container \"01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2\": container with ID starting with 01405172ace623d6ebd13eb0dda25a61de3ea01f92c37b2c4ecf7539271affa2 not found: ID does not exist" Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.269430 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r2xb4"] Oct 07 19:08:06 crc kubenswrapper[4825]: I1007 19:08:06.275589 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r2xb4"] Oct 07 19:08:07 crc kubenswrapper[4825]: I1007 19:08:07.806985 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f51b57-041d-4009-9db3-3579fa7bb84c" path="/var/lib/kubelet/pods/a4f51b57-041d-4009-9db3-3579fa7bb84c/volumes" Oct 07 19:09:11 crc kubenswrapper[4825]: I1007 19:09:11.921191 4825 scope.go:117] "RemoveContainer" containerID="a746270b14f3a6a793b32395cee50990687647bee292663e37f5274f012ab882" Oct 07 19:09:35 crc kubenswrapper[4825]: I1007 19:09:35.709347 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:09:35 crc kubenswrapper[4825]: I1007 19:09:35.709985 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:10:05 crc kubenswrapper[4825]: I1007 19:10:05.709005 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:10:05 crc kubenswrapper[4825]: I1007 19:10:05.709606 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:10:11 crc kubenswrapper[4825]: I1007 19:10:11.960626 4825 scope.go:117] "RemoveContainer" containerID="d4a13d63b02b29e84d731f5ea381c2e0a2f2f2aead0dcefbd065c14b0a317ca3" Oct 07 19:10:11 crc kubenswrapper[4825]: I1007 19:10:11.981507 4825 scope.go:117] "RemoveContainer" containerID="84b85e890c84d7dffae6e0ef01d0cd172bd26e3773e887b45f4431daa24a1653" Oct 07 19:10:35 crc kubenswrapper[4825]: I1007 19:10:35.708670 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:10:35 crc kubenswrapper[4825]: I1007 19:10:35.709284 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:10:35 crc kubenswrapper[4825]: I1007 19:10:35.709350 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:10:35 crc kubenswrapper[4825]: I1007 19:10:35.710115 4825 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"363a8a3b4b4e09ebede35ca07198927c54a01eb1008f7fc708faf1a573e0f6cc"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:10:35 crc kubenswrapper[4825]: I1007 19:10:35.710202 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://363a8a3b4b4e09ebede35ca07198927c54a01eb1008f7fc708faf1a573e0f6cc" gracePeriod=600 Oct 07 19:10:36 crc kubenswrapper[4825]: I1007 19:10:36.216267 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="363a8a3b4b4e09ebede35ca07198927c54a01eb1008f7fc708faf1a573e0f6cc" exitCode=0 Oct 07 19:10:36 crc kubenswrapper[4825]: I1007 19:10:36.216386 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"363a8a3b4b4e09ebede35ca07198927c54a01eb1008f7fc708faf1a573e0f6cc"} Oct 07 19:10:36 crc kubenswrapper[4825]: I1007 19:10:36.216640 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"1aff985c4d465af81432b2c0fd1da9cb01f3378b2087e04530a854de44547a92"} Oct 07 19:10:36 crc kubenswrapper[4825]: I1007 19:10:36.216664 4825 scope.go:117] "RemoveContainer" containerID="d59266bf242c50a2596b3ab7b505a4aa50801a6525e38f53609ceb79dca8838b" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.225481 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ff56x"] Oct 07 19:11:31 
crc kubenswrapper[4825]: E1007 19:11:31.226276 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f51b57-041d-4009-9db3-3579fa7bb84c" containerName="registry" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.226292 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f51b57-041d-4009-9db3-3579fa7bb84c" containerName="registry" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.226403 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f51b57-041d-4009-9db3-3579fa7bb84c" containerName="registry" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.226821 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.228672 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.229403 4825 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9p7cd" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.229577 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.245567 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ff56x"] Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.252033 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vhchg"] Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.258437 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vhchg" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.262123 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ph62w"] Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.263080 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.266165 4825 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hhzx9" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.266427 4825 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rb6st" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.277339 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qnw\" (UniqueName: \"kubernetes.io/projected/1ff41e8d-639e-4710-a863-1c6dbec99768-kube-api-access-j2qnw\") pod \"cert-manager-5b446d88c5-vhchg\" (UID: \"1ff41e8d-639e-4710-a863-1c6dbec99768\") " pod="cert-manager/cert-manager-5b446d88c5-vhchg" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.277411 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22grk\" (UniqueName: \"kubernetes.io/projected/491e6da2-5d0d-4a47-abda-467d60d5ec14-kube-api-access-22grk\") pod \"cert-manager-cainjector-7f985d654d-ff56x\" (UID: \"491e6da2-5d0d-4a47-abda-467d60d5ec14\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.279974 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq98x\" (UniqueName: \"kubernetes.io/projected/a95caa53-91d1-4d61-872a-c0ff3539d4d7-kube-api-access-xq98x\") pod 
\"cert-manager-webhook-5655c58dd6-ph62w\" (UID: \"a95caa53-91d1-4d61-872a-c0ff3539d4d7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.287571 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vhchg"] Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.292738 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ph62w"] Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.380685 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq98x\" (UniqueName: \"kubernetes.io/projected/a95caa53-91d1-4d61-872a-c0ff3539d4d7-kube-api-access-xq98x\") pod \"cert-manager-webhook-5655c58dd6-ph62w\" (UID: \"a95caa53-91d1-4d61-872a-c0ff3539d4d7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.380812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2qnw\" (UniqueName: \"kubernetes.io/projected/1ff41e8d-639e-4710-a863-1c6dbec99768-kube-api-access-j2qnw\") pod \"cert-manager-5b446d88c5-vhchg\" (UID: \"1ff41e8d-639e-4710-a863-1c6dbec99768\") " pod="cert-manager/cert-manager-5b446d88c5-vhchg" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.380849 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22grk\" (UniqueName: \"kubernetes.io/projected/491e6da2-5d0d-4a47-abda-467d60d5ec14-kube-api-access-22grk\") pod \"cert-manager-cainjector-7f985d654d-ff56x\" (UID: \"491e6da2-5d0d-4a47-abda-467d60d5ec14\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.398063 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2qnw\" (UniqueName: 
\"kubernetes.io/projected/1ff41e8d-639e-4710-a863-1c6dbec99768-kube-api-access-j2qnw\") pod \"cert-manager-5b446d88c5-vhchg\" (UID: \"1ff41e8d-639e-4710-a863-1c6dbec99768\") " pod="cert-manager/cert-manager-5b446d88c5-vhchg" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.398135 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22grk\" (UniqueName: \"kubernetes.io/projected/491e6da2-5d0d-4a47-abda-467d60d5ec14-kube-api-access-22grk\") pod \"cert-manager-cainjector-7f985d654d-ff56x\" (UID: \"491e6da2-5d0d-4a47-abda-467d60d5ec14\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.398446 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq98x\" (UniqueName: \"kubernetes.io/projected/a95caa53-91d1-4d61-872a-c0ff3539d4d7-kube-api-access-xq98x\") pod \"cert-manager-webhook-5655c58dd6-ph62w\" (UID: \"a95caa53-91d1-4d61-872a-c0ff3539d4d7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.543544 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.590644 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vhchg" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.598685 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.767243 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ff56x"] Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.779089 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:11:31 crc kubenswrapper[4825]: I1007 19:11:31.845011 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vhchg"] Oct 07 19:11:31 crc kubenswrapper[4825]: W1007 19:11:31.847953 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ff41e8d_639e_4710_a863_1c6dbec99768.slice/crio-ea295ffa718356b8e7eafdafdf0f8c9591c81ba31c46c2977916822d27f2eace WatchSource:0}: Error finding container ea295ffa718356b8e7eafdafdf0f8c9591c81ba31c46c2977916822d27f2eace: Status 404 returned error can't find the container with id ea295ffa718356b8e7eafdafdf0f8c9591c81ba31c46c2977916822d27f2eace Oct 07 19:11:32 crc kubenswrapper[4825]: I1007 19:11:32.031126 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ph62w"] Oct 07 19:11:32 crc kubenswrapper[4825]: W1007 19:11:32.035432 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda95caa53_91d1_4d61_872a_c0ff3539d4d7.slice/crio-6598e5462b6d64169f4d2012591e325d210f0f6d09148367432edb6fa7b9ecc6 WatchSource:0}: Error finding container 6598e5462b6d64169f4d2012591e325d210f0f6d09148367432edb6fa7b9ecc6: Status 404 returned error can't find the container with id 6598e5462b6d64169f4d2012591e325d210f0f6d09148367432edb6fa7b9ecc6 Oct 07 19:11:32 crc kubenswrapper[4825]: I1007 19:11:32.607609 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" event={"ID":"491e6da2-5d0d-4a47-abda-467d60d5ec14","Type":"ContainerStarted","Data":"bbb54cc72f715ed6ea70e90c1c3d85fccce00319673b25c849bf3f67d7006c7b"} Oct 07 19:11:32 crc kubenswrapper[4825]: I1007 19:11:32.609084 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vhchg" event={"ID":"1ff41e8d-639e-4710-a863-1c6dbec99768","Type":"ContainerStarted","Data":"ea295ffa718356b8e7eafdafdf0f8c9591c81ba31c46c2977916822d27f2eace"} Oct 07 19:11:32 crc kubenswrapper[4825]: I1007 19:11:32.612783 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" event={"ID":"a95caa53-91d1-4d61-872a-c0ff3539d4d7","Type":"ContainerStarted","Data":"6598e5462b6d64169f4d2012591e325d210f0f6d09148367432edb6fa7b9ecc6"} Oct 07 19:11:34 crc kubenswrapper[4825]: I1007 19:11:34.628174 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" event={"ID":"a95caa53-91d1-4d61-872a-c0ff3539d4d7","Type":"ContainerStarted","Data":"f689d994d325325dcb6a80a59920cd23462cfc1d7df2539401027eff2a04276b"} Oct 07 19:11:34 crc kubenswrapper[4825]: I1007 19:11:34.629318 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" Oct 07 19:11:34 crc kubenswrapper[4825]: I1007 19:11:34.629588 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" event={"ID":"491e6da2-5d0d-4a47-abda-467d60d5ec14","Type":"ContainerStarted","Data":"c7df60f9a9ce01256254a8cd538cfbeaf8e008d553015c1c4ced5f932fc52d18"} Oct 07 19:11:34 crc kubenswrapper[4825]: I1007 19:11:34.650884 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" podStartSLOduration=1.206267114 podStartE2EDuration="3.6508514s" podCreationTimestamp="2025-10-07 
19:11:31 +0000 UTC" firstStartedPulling="2025-10-07 19:11:32.037540979 +0000 UTC m=+680.859579616" lastFinishedPulling="2025-10-07 19:11:34.482125255 +0000 UTC m=+683.304163902" observedRunningTime="2025-10-07 19:11:34.646352216 +0000 UTC m=+683.468390853" watchObservedRunningTime="2025-10-07 19:11:34.6508514 +0000 UTC m=+683.472890037" Oct 07 19:11:34 crc kubenswrapper[4825]: I1007 19:11:34.663607 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-ff56x" podStartSLOduration=1.022713722 podStartE2EDuration="3.66358452s" podCreationTimestamp="2025-10-07 19:11:31 +0000 UTC" firstStartedPulling="2025-10-07 19:11:31.778907273 +0000 UTC m=+680.600945910" lastFinishedPulling="2025-10-07 19:11:34.419778071 +0000 UTC m=+683.241816708" observedRunningTime="2025-10-07 19:11:34.662619259 +0000 UTC m=+683.484657896" watchObservedRunningTime="2025-10-07 19:11:34.66358452 +0000 UTC m=+683.485623157" Oct 07 19:11:35 crc kubenswrapper[4825]: I1007 19:11:35.636698 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vhchg" event={"ID":"1ff41e8d-639e-4710-a863-1c6dbec99768","Type":"ContainerStarted","Data":"64c997fa15c8bb3ac725aa357a27baae858399d327494af83722c8766f2e704a"} Oct 07 19:11:35 crc kubenswrapper[4825]: I1007 19:11:35.652669 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-vhchg" podStartSLOduration=1.212209495 podStartE2EDuration="4.652650263s" podCreationTimestamp="2025-10-07 19:11:31 +0000 UTC" firstStartedPulling="2025-10-07 19:11:31.851813297 +0000 UTC m=+680.673851934" lastFinishedPulling="2025-10-07 19:11:35.292254055 +0000 UTC m=+684.114292702" observedRunningTime="2025-10-07 19:11:35.651363192 +0000 UTC m=+684.473401869" watchObservedRunningTime="2025-10-07 19:11:35.652650263 +0000 UTC m=+684.474688920" Oct 07 19:11:41 crc kubenswrapper[4825]: I1007 19:11:41.602590 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-ph62w" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.081693 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6lvdm"] Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.083661 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovn-controller" containerID="cri-o://913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.083720 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="nbdb" containerID="cri-o://392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.083761 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.083847 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovn-acl-logging" containerID="cri-o://a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.083950 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" 
podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="northd" containerID="cri-o://f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.083824 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-node" containerID="cri-o://6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.084061 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="sbdb" containerID="cri-o://f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.131925 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" containerID="cri-o://4e5cb1687aec0c71724fc5f61de5151d3cc9b7e2dad5ae77d4306a015abd7aeb" gracePeriod=30 Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.528485 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04 is running failed: container process not found" containerID="f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.528778 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b is running failed: container process not found" containerID="392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.529222 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04 is running failed: container process not found" containerID="f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.529539 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b is running failed: container process not found" containerID="392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.529788 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04 is running failed: container process not found" containerID="f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.529852 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="sbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.530249 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b is running failed: container process not found" containerID="392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.530301 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="nbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.689391 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovnkube-controller/3.log" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.694449 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovn-acl-logging/0.log" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.695292 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovn-controller/0.log" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.695866 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="4e5cb1687aec0c71724fc5f61de5151d3cc9b7e2dad5ae77d4306a015abd7aeb" exitCode=0 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.695913 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04" exitCode=0 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.695927 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b" exitCode=0 Oct 07 19:11:42 
crc kubenswrapper[4825]: I1007 19:11:42.695944 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc" exitCode=0 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.695936 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"4e5cb1687aec0c71724fc5f61de5151d3cc9b7e2dad5ae77d4306a015abd7aeb"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696025 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696058 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.695960 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8" exitCode=0 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696096 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2" exitCode=0 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696079 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" 
event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696115 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04" exitCode=143 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696130 4825 generic.go:334] "Generic (PLEG): container finished" podID="11546b62-cdda-449d-963e-418c2d4b6e46" containerID="913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab" exitCode=143 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696133 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696160 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696099 4825 scope.go:117] "RemoveContainer" containerID="4f77669353aaa0deb54b8519f6c7a7734f5a44001abcf2bb19baa55fd5c050ff" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.696344 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" 
event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.699303 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/2.log" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.704022 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/1.log" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.704076 4825 generic.go:334] "Generic (PLEG): container finished" podID="44f62e96-26a6-4bfe-8e8c-6884216bd363" containerID="7b220af5033e5f708bf3bc3586aa956717a4b5f61911848ffc4808a2221bcaa4" exitCode=2 Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.704111 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerDied","Data":"7b220af5033e5f708bf3bc3586aa956717a4b5f61911848ffc4808a2221bcaa4"} Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.704676 4825 scope.go:117] "RemoveContainer" containerID="7b220af5033e5f708bf3bc3586aa956717a4b5f61911848ffc4808a2221bcaa4" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.705027 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zk9x9_openshift-multus(44f62e96-26a6-4bfe-8e8c-6884216bd363)\"" pod="openshift-multus/multus-zk9x9" podUID="44f62e96-26a6-4bfe-8e8c-6884216bd363" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.746626 4825 scope.go:117] "RemoveContainer" containerID="58e5cbd6853b21641655497f3c250645e7ea086a9dfe7d7e6b941b1cdabc5953" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.788039 4825 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovn-acl-logging/0.log" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.788763 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovn-controller/0.log" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.789286 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.867847 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-netd\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.868428 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-ovn\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.868469 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-script-lib\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.868509 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-etc-openvswitch\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc 
kubenswrapper[4825]: I1007 19:11:42.868557 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-kubelet\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.868606 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-env-overrides\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.868641 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11546b62-cdda-449d-963e-418c2d4b6e46-ovn-node-metrics-cert\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.868680 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-systemd\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.869389 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-config\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.869437 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-node-log\") pod 
\"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.869474 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-systemd-units\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.869500 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-ovn-kubernetes\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.869524 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-var-lib-openvswitch\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.869588 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-netns\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.869628 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-log-socket\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.870799 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-slash\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.870832 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-var-lib-cni-networks-ovn-kubernetes\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.870861 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-openvswitch\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.870903 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmmv8\" (UniqueName: \"kubernetes.io/projected/11546b62-cdda-449d-963e-418c2d4b6e46-kube-api-access-qmmv8\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.870946 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-bin\") pod \"11546b62-cdda-449d-963e-418c2d4b6e46\" (UID: \"11546b62-cdda-449d-963e-418c2d4b6e46\") " Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871485 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod 
"11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871561 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871554 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-node-log" (OuterVolumeSpecName: "node-log") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871605 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871604 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871654 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871641 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871686 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871707 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-slash" (OuterVolumeSpecName: "host-slash") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.871920 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-log-socket" (OuterVolumeSpecName: "log-socket") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.872140 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.872174 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.872272 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.873558 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.874001 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.874694 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.876606 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fbwwz"] Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.876946 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kubecfg-setup" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.876974 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kubecfg-setup" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877005 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovn-acl-logging" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877019 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovn-acl-logging" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877039 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877053 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877071 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877084 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877103 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" 
containerName="ovn-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877116 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovn-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877135 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="sbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877147 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="sbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877162 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="nbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877217 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="nbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877263 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877277 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877292 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-node" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877305 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-node" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877325 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 
19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877338 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877358 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877372 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877389 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="northd" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877403 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="northd" Oct 07 19:11:42 crc kubenswrapper[4825]: E1007 19:11:42.877421 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877437 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877602 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovn-acl-logging" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877619 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-node" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877638 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" 
containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877655 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877669 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877683 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877700 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovn-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877719 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="nbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877734 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="northd" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.877753 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="sbdb" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.878101 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.878116 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" containerName="ovnkube-controller" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.879678 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/11546b62-cdda-449d-963e-418c2d4b6e46-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.880140 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11546b62-cdda-449d-963e-418c2d4b6e46-kube-api-access-qmmv8" (OuterVolumeSpecName: "kube-api-access-qmmv8") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "kube-api-access-qmmv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.881121 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.892200 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "11546b62-cdda-449d-963e-418c2d4b6e46" (UID: "11546b62-cdda-449d-963e-418c2d4b6e46"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.973452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-run-netns\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.973537 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-env-overrides\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.973653 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-ovn\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.973842 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-node-log\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.973919 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-run-ovn-kubernetes\") pod \"ovnkube-node-fbwwz\" (UID: 
\"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.973985 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974016 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-kubelet\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974079 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7q7\" (UniqueName: \"kubernetes.io/projected/b869c622-4bb1-4030-b29d-a422cc65d73e-kube-api-access-jb7q7\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974117 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-systemd-units\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974149 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-log-socket\") pod 
\"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974215 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-ovnkube-config\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974302 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-ovnkube-script-lib\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974377 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-var-lib-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974414 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974445 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-etc-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974557 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-systemd\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974633 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-cni-bin\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-cni-netd\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974715 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b869c622-4bb1-4030-b29d-a422cc65d73e-ovn-node-metrics-cert\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974781 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-slash\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.974986 4825 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975020 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975044 4825 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975064 4825 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975085 4825 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975104 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11546b62-cdda-449d-963e-418c2d4b6e46-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975121 4825 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975138 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11546b62-cdda-449d-963e-418c2d4b6e46-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975155 4825 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-node-log\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975172 4825 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975190 4825 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975207 4825 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975272 4825 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975291 4825 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-log-socket\") on node \"crc\" DevicePath \"\"" Oct 07 
19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975309 4825 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975326 4825 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-slash\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975345 4825 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975363 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmmv8\" (UniqueName: \"kubernetes.io/projected/11546b62-cdda-449d-963e-418c2d4b6e46-kube-api-access-qmmv8\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975381 4825 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:42 crc kubenswrapper[4825]: I1007 19:11:42.975396 4825 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11546b62-cdda-449d-963e-418c2d4b6e46-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077018 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-run-netns\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077108 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-env-overrides\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077147 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-run-netns\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077168 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-ovn\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077278 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-ovn\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077329 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-node-log\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077375 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-run-ovn-kubernetes\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077417 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077450 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-kubelet\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077484 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-node-log\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077521 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7q7\" (UniqueName: \"kubernetes.io/projected/b869c622-4bb1-4030-b29d-a422cc65d73e-kube-api-access-jb7q7\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077571 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-kubelet\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077561 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-run-ovn-kubernetes\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077596 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-systemd-units\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077651 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-systemd-units\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077671 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-log-socket\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077593 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077716 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-ovnkube-config\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077765 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-log-socket\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077829 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-ovnkube-script-lib\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077926 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-var-lib-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.077981 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078041 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-etc-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078089 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-systemd\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078084 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-var-lib-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078105 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-etc-openvswitch\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078196 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-cni-netd\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078261 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-run-systemd\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078285 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-cni-bin\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078323 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-cni-bin\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078285 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-cni-netd\") pod \"ovnkube-node-fbwwz\" (UID: 
\"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078367 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b869c622-4bb1-4030-b29d-a422cc65d73e-ovn-node-metrics-cert\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078434 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-slash\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.078490 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b869c622-4bb1-4030-b29d-a422cc65d73e-host-slash\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.079422 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-ovnkube-config\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.080552 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-env-overrides\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 
crc kubenswrapper[4825]: I1007 19:11:43.080787 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b869c622-4bb1-4030-b29d-a422cc65d73e-ovnkube-script-lib\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.095148 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b869c622-4bb1-4030-b29d-a422cc65d73e-ovn-node-metrics-cert\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.122049 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7q7\" (UniqueName: \"kubernetes.io/projected/b869c622-4bb1-4030-b29d-a422cc65d73e-kube-api-access-jb7q7\") pod \"ovnkube-node-fbwwz\" (UID: \"b869c622-4bb1-4030-b29d-a422cc65d73e\") " pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.222321 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.719466 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovn-acl-logging/0.log" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.720485 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6lvdm_11546b62-cdda-449d-963e-418c2d4b6e46/ovn-controller/0.log" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.721486 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" event={"ID":"11546b62-cdda-449d-963e-418c2d4b6e46","Type":"ContainerDied","Data":"a1ebad1f97a9efe415c351baad2fd4e11338df5fb533ee22b295891820bc5a21"} Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.721589 4825 scope.go:117] "RemoveContainer" containerID="4e5cb1687aec0c71724fc5f61de5151d3cc9b7e2dad5ae77d4306a015abd7aeb" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.721522 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6lvdm" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.724384 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/2.log" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.731393 4825 generic.go:334] "Generic (PLEG): container finished" podID="b869c622-4bb1-4030-b29d-a422cc65d73e" containerID="5645eb844b2089c9c5f106483057fe9c9f339b8903a7607eeb2351ee02379c9b" exitCode=0 Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.731469 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerDied","Data":"5645eb844b2089c9c5f106483057fe9c9f339b8903a7607eeb2351ee02379c9b"} Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.731527 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"27fdb8c3ae19772187055041c046aa78cfd4759ebb914e7f912475cf48a1cb0a"} Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.751112 4825 scope.go:117] "RemoveContainer" containerID="f7d43c3a8075d9bca039adaa310284209fe56d19b70f45cc73b24cf1d5b79a04" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.781680 4825 scope.go:117] "RemoveContainer" containerID="392f46b434d836910e9efc4557d2293789a98766c11fd515ce030be9d0af852b" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.816368 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6lvdm"] Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.820178 4825 scope.go:117] "RemoveContainer" containerID="f6fb4f3ce2ca4b5783d5731f662d9ee920e8845cf75325ac807dd5ed8c38c8cc" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.823183 4825 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6lvdm"] Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.875013 4825 scope.go:117] "RemoveContainer" containerID="f829f3b934af52a376471c608a0ee9ec281fe8f200d0829b6edfa03461b055c8" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.895291 4825 scope.go:117] "RemoveContainer" containerID="6ed8080026f7a33d2f3168d6608277801e9525ce49dbce505b95715bc1a6adb2" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.914815 4825 scope.go:117] "RemoveContainer" containerID="a2a30bde4aee6c22f6579941d9d91fdb9874bcbb3112ff03ed943fd1c143ac04" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.937287 4825 scope.go:117] "RemoveContainer" containerID="913af77480319fc17e1cf057dc369eb640caa89a8e538438fa031368e9504bab" Oct 07 19:11:43 crc kubenswrapper[4825]: I1007 19:11:43.959613 4825 scope.go:117] "RemoveContainer" containerID="7ea082d9663f1e196f3378294f88ec9183b57cd0360383eb2c360dc0a7494b4c" Oct 07 19:11:44 crc kubenswrapper[4825]: I1007 19:11:44.750656 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"82e3128d23e31d7528edd652f02c2e53b3f624d96abf81a24c105f28834cd801"} Oct 07 19:11:44 crc kubenswrapper[4825]: I1007 19:11:44.751049 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"c9193c0b083a70946abad2a285232eac59fc0b973665903c7a366135163b4344"} Oct 07 19:11:44 crc kubenswrapper[4825]: I1007 19:11:44.751103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"d7f3de8338bafaccd5b41a24b75614ae12ad20cd45ffc1a38a3fa37c03f78679"} Oct 07 19:11:44 crc kubenswrapper[4825]: I1007 19:11:44.751121 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"d99cdfb88c53a4ffbad56ed780e1cd4a822fe5d35f4d2bb0169d6152f5620592"} Oct 07 19:11:44 crc kubenswrapper[4825]: I1007 19:11:44.751136 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"b6b40f57bab83b6045f20a422b59a4ed50fdf6d102fdcb794f3228e763351667"} Oct 07 19:11:44 crc kubenswrapper[4825]: I1007 19:11:44.751157 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"120743585fe64d13445c89480b9b181723d686b8d84ce9903bd9ffabb25a6f1c"} Oct 07 19:11:45 crc kubenswrapper[4825]: I1007 19:11:45.806191 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11546b62-cdda-449d-963e-418c2d4b6e46" path="/var/lib/kubelet/pods/11546b62-cdda-449d-963e-418c2d4b6e46/volumes" Oct 07 19:11:47 crc kubenswrapper[4825]: I1007 19:11:47.779463 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"5032423af5e5d9fb99911133773552924a25325d275d87d7d5e8ad27e2b46dac"} Oct 07 19:11:49 crc kubenswrapper[4825]: I1007 19:11:49.808129 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:49 crc kubenswrapper[4825]: I1007 19:11:49.808912 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" event={"ID":"b869c622-4bb1-4030-b29d-a422cc65d73e","Type":"ContainerStarted","Data":"10638e47300b0840cb5a1efbc3fed85fd1fcb1af7c51eee29ebe9a6c01ef70dc"} Oct 07 19:11:49 crc 
kubenswrapper[4825]: I1007 19:11:49.845172 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:49 crc kubenswrapper[4825]: I1007 19:11:49.849597 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" podStartSLOduration=7.849576488 podStartE2EDuration="7.849576488s" podCreationTimestamp="2025-10-07 19:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:11:49.848208664 +0000 UTC m=+698.670247301" watchObservedRunningTime="2025-10-07 19:11:49.849576488 +0000 UTC m=+698.671615145" Oct 07 19:11:50 crc kubenswrapper[4825]: I1007 19:11:50.806090 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:50 crc kubenswrapper[4825]: I1007 19:11:50.806154 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:50 crc kubenswrapper[4825]: I1007 19:11:50.840458 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:11:56 crc kubenswrapper[4825]: I1007 19:11:56.796034 4825 scope.go:117] "RemoveContainer" containerID="7b220af5033e5f708bf3bc3586aa956717a4b5f61911848ffc4808a2221bcaa4" Oct 07 19:11:56 crc kubenswrapper[4825]: E1007 19:11:56.797758 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zk9x9_openshift-multus(44f62e96-26a6-4bfe-8e8c-6884216bd363)\"" pod="openshift-multus/multus-zk9x9" podUID="44f62e96-26a6-4bfe-8e8c-6884216bd363" Oct 07 19:12:07 crc kubenswrapper[4825]: I1007 19:12:07.795405 4825 scope.go:117] "RemoveContainer" 
containerID="7b220af5033e5f708bf3bc3586aa956717a4b5f61911848ffc4808a2221bcaa4" Oct 07 19:12:08 crc kubenswrapper[4825]: I1007 19:12:08.925895 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zk9x9_44f62e96-26a6-4bfe-8e8c-6884216bd363/kube-multus/2.log" Oct 07 19:12:08 crc kubenswrapper[4825]: I1007 19:12:08.926425 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zk9x9" event={"ID":"44f62e96-26a6-4bfe-8e8c-6884216bd363","Type":"ContainerStarted","Data":"b27b54ba650d3bf9b6dd9786b3442b1b93fd389652d5d3bc0d7151e44c74781e"} Oct 07 19:12:13 crc kubenswrapper[4825]: I1007 19:12:13.248503 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fbwwz" Oct 07 19:12:18 crc kubenswrapper[4825]: I1007 19:12:18.797828 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4"] Oct 07 19:12:18 crc kubenswrapper[4825]: I1007 19:12:18.800061 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:18 crc kubenswrapper[4825]: I1007 19:12:18.803071 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 19:12:18 crc kubenswrapper[4825]: I1007 19:12:18.814482 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4"] Oct 07 19:12:18 crc kubenswrapper[4825]: I1007 19:12:18.920382 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:18 crc kubenswrapper[4825]: I1007 19:12:18.920512 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5dhs\" (UniqueName: \"kubernetes.io/projected/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-kube-api-access-f5dhs\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:18 crc kubenswrapper[4825]: I1007 19:12:18.920670 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: 
I1007 19:12:19.022603 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: I1007 19:12:19.022699 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5dhs\" (UniqueName: \"kubernetes.io/projected/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-kube-api-access-f5dhs\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: I1007 19:12:19.022768 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: I1007 19:12:19.023558 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: I1007 19:12:19.023796 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: I1007 19:12:19.057088 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5dhs\" (UniqueName: \"kubernetes.io/projected/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-kube-api-access-f5dhs\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: I1007 19:12:19.133060 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:19 crc kubenswrapper[4825]: I1007 19:12:19.420195 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4"] Oct 07 19:12:20 crc kubenswrapper[4825]: I1007 19:12:20.005130 4825 generic.go:334] "Generic (PLEG): container finished" podID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerID="32af9e99bf515f318e062a13dc2e1808003bf67fdb4ca69378f441421fe77c7a" exitCode=0 Oct 07 19:12:20 crc kubenswrapper[4825]: I1007 19:12:20.005193 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" event={"ID":"3b2c30cb-8398-4238-a5cd-eb2ee78812a1","Type":"ContainerDied","Data":"32af9e99bf515f318e062a13dc2e1808003bf67fdb4ca69378f441421fe77c7a"} Oct 07 19:12:20 crc kubenswrapper[4825]: I1007 19:12:20.005265 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" event={"ID":"3b2c30cb-8398-4238-a5cd-eb2ee78812a1","Type":"ContainerStarted","Data":"44ffa6f449966ed0c98b4c2bb1b3404f8c28b507942015d714b16831d6d43879"} Oct 07 19:12:22 crc kubenswrapper[4825]: I1007 19:12:22.019147 4825 generic.go:334] "Generic (PLEG): container finished" podID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerID="bec63d1403c827998cd1aa72fa9f9a02a8375085405a94451c2c581fa4946ec7" exitCode=0 Oct 07 19:12:22 crc kubenswrapper[4825]: I1007 19:12:22.019245 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" event={"ID":"3b2c30cb-8398-4238-a5cd-eb2ee78812a1","Type":"ContainerDied","Data":"bec63d1403c827998cd1aa72fa9f9a02a8375085405a94451c2c581fa4946ec7"} Oct 07 19:12:23 crc kubenswrapper[4825]: I1007 19:12:23.029196 4825 generic.go:334] "Generic (PLEG): container finished" podID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerID="cc8fc031eaf786a1a9febe2017c15771ce5afed658355f94fa37a79d9199a1f2" exitCode=0 Oct 07 19:12:23 crc kubenswrapper[4825]: I1007 19:12:23.029298 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" event={"ID":"3b2c30cb-8398-4238-a5cd-eb2ee78812a1","Type":"ContainerDied","Data":"cc8fc031eaf786a1a9febe2017c15771ce5afed658355f94fa37a79d9199a1f2"} Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.394597 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.508759 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5dhs\" (UniqueName: \"kubernetes.io/projected/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-kube-api-access-f5dhs\") pod \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.508976 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-util\") pod \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.509201 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-bundle\") pod \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\" (UID: \"3b2c30cb-8398-4238-a5cd-eb2ee78812a1\") " Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.510318 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-bundle" (OuterVolumeSpecName: "bundle") pod "3b2c30cb-8398-4238-a5cd-eb2ee78812a1" (UID: "3b2c30cb-8398-4238-a5cd-eb2ee78812a1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.519471 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-kube-api-access-f5dhs" (OuterVolumeSpecName: "kube-api-access-f5dhs") pod "3b2c30cb-8398-4238-a5cd-eb2ee78812a1" (UID: "3b2c30cb-8398-4238-a5cd-eb2ee78812a1"). InnerVolumeSpecName "kube-api-access-f5dhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.526069 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-util" (OuterVolumeSpecName: "util") pod "3b2c30cb-8398-4238-a5cd-eb2ee78812a1" (UID: "3b2c30cb-8398-4238-a5cd-eb2ee78812a1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.611603 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-util\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.611648 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:24 crc kubenswrapper[4825]: I1007 19:12:24.611661 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5dhs\" (UniqueName: \"kubernetes.io/projected/3b2c30cb-8398-4238-a5cd-eb2ee78812a1-kube-api-access-f5dhs\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:25 crc kubenswrapper[4825]: I1007 19:12:25.045961 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" event={"ID":"3b2c30cb-8398-4238-a5cd-eb2ee78812a1","Type":"ContainerDied","Data":"44ffa6f449966ed0c98b4c2bb1b3404f8c28b507942015d714b16831d6d43879"} Oct 07 19:12:25 crc kubenswrapper[4825]: I1007 19:12:25.046026 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44ffa6f449966ed0c98b4c2bb1b3404f8c28b507942015d714b16831d6d43879" Oct 07 19:12:25 crc kubenswrapper[4825]: I1007 19:12:25.046067 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.510085 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8lh75"] Oct 07 19:12:27 crc kubenswrapper[4825]: E1007 19:12:27.510366 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerName="extract" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.510383 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerName="extract" Oct 07 19:12:27 crc kubenswrapper[4825]: E1007 19:12:27.510399 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerName="util" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.510406 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerName="util" Oct 07 19:12:27 crc kubenswrapper[4825]: E1007 19:12:27.510417 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerName="pull" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.510426 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerName="pull" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.510568 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2c30cb-8398-4238-a5cd-eb2ee78812a1" containerName="extract" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.511015 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.517452 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.518954 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.525584 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-77vpp" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.525820 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8lh75"] Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.549398 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278cs\" (UniqueName: \"kubernetes.io/projected/00238ddf-6ee8-44a7-97a3-7d1563e1a1d7-kube-api-access-278cs\") pod \"nmstate-operator-858ddd8f98-8lh75\" (UID: \"00238ddf-6ee8-44a7-97a3-7d1563e1a1d7\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.650203 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-278cs\" (UniqueName: \"kubernetes.io/projected/00238ddf-6ee8-44a7-97a3-7d1563e1a1d7-kube-api-access-278cs\") pod \"nmstate-operator-858ddd8f98-8lh75\" (UID: \"00238ddf-6ee8-44a7-97a3-7d1563e1a1d7\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.685694 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-278cs\" (UniqueName: \"kubernetes.io/projected/00238ddf-6ee8-44a7-97a3-7d1563e1a1d7-kube-api-access-278cs\") pod \"nmstate-operator-858ddd8f98-8lh75\" (UID: 
\"00238ddf-6ee8-44a7-97a3-7d1563e1a1d7\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" Oct 07 19:12:27 crc kubenswrapper[4825]: I1007 19:12:27.835483 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" Oct 07 19:12:28 crc kubenswrapper[4825]: I1007 19:12:28.044951 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8lh75"] Oct 07 19:12:28 crc kubenswrapper[4825]: I1007 19:12:28.063852 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" event={"ID":"00238ddf-6ee8-44a7-97a3-7d1563e1a1d7","Type":"ContainerStarted","Data":"2f1a856ecb7f4003163003cbbc96ee2abd0e647ea64c180e18337c92921476e6"} Oct 07 19:12:31 crc kubenswrapper[4825]: I1007 19:12:31.087993 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" event={"ID":"00238ddf-6ee8-44a7-97a3-7d1563e1a1d7","Type":"ContainerStarted","Data":"5adee3dd82a5d822cce290ab912c4d8e62a9bb0afe6dbe7660110f04021a68cf"} Oct 07 19:12:31 crc kubenswrapper[4825]: I1007 19:12:31.120280 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lh75" podStartSLOduration=2.198140774 podStartE2EDuration="4.120254359s" podCreationTimestamp="2025-10-07 19:12:27 +0000 UTC" firstStartedPulling="2025-10-07 19:12:28.050686949 +0000 UTC m=+736.872725586" lastFinishedPulling="2025-10-07 19:12:29.972800514 +0000 UTC m=+738.794839171" observedRunningTime="2025-10-07 19:12:31.117092887 +0000 UTC m=+739.939131564" watchObservedRunningTime="2025-10-07 19:12:31.120254359 +0000 UTC m=+739.942293036" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.172572 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 
19:12:32.173621 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.177789 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hh7kl" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.214243 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.215093 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.219010 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.225204 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.234388 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.242135 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-v8dxh"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.243047 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.304466 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.305219 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.323724 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a194e8ec-fa8a-4efb-af70-ea121bb7d835-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-4kjx7\" (UID: \"a194e8ec-fa8a-4efb-af70-ea121bb7d835\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.325171 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q6lv\" (UniqueName: \"kubernetes.io/projected/a194e8ec-fa8a-4efb-af70-ea121bb7d835-kube-api-access-7q6lv\") pod \"nmstate-webhook-6cdbc54649-4kjx7\" (UID: \"a194e8ec-fa8a-4efb-af70-ea121bb7d835\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.325206 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlz4h\" (UniqueName: \"kubernetes.io/projected/fef914bd-768f-4cd2-92c1-b5fb49e63ca8-kube-api-access-rlz4h\") pod \"nmstate-metrics-fdff9cb8d-vv8pn\" (UID: \"fef914bd-768f-4cd2-92c1-b5fb49e63ca8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.327551 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.327855 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.333607 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bt8l2" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.338365 4825 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426467 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q6lv\" (UniqueName: \"kubernetes.io/projected/a194e8ec-fa8a-4efb-af70-ea121bb7d835-kube-api-access-7q6lv\") pod \"nmstate-webhook-6cdbc54649-4kjx7\" (UID: \"a194e8ec-fa8a-4efb-af70-ea121bb7d835\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426520 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlz4h\" (UniqueName: \"kubernetes.io/projected/fef914bd-768f-4cd2-92c1-b5fb49e63ca8-kube-api-access-rlz4h\") pod \"nmstate-metrics-fdff9cb8d-vv8pn\" (UID: \"fef914bd-768f-4cd2-92c1-b5fb49e63ca8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426558 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426597 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-ovs-socket\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426632 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-dbus-socket\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426658 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-nmstate-lock\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426839 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvrh\" (UniqueName: \"kubernetes.io/projected/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-kube-api-access-hqvrh\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426890 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv79n\" (UniqueName: \"kubernetes.io/projected/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-kube-api-access-gv79n\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426945 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.426980 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a194e8ec-fa8a-4efb-af70-ea121bb7d835-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-4kjx7\" (UID: \"a194e8ec-fa8a-4efb-af70-ea121bb7d835\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.445251 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a194e8ec-fa8a-4efb-af70-ea121bb7d835-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-4kjx7\" (UID: \"a194e8ec-fa8a-4efb-af70-ea121bb7d835\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.455408 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlz4h\" (UniqueName: \"kubernetes.io/projected/fef914bd-768f-4cd2-92c1-b5fb49e63ca8-kube-api-access-rlz4h\") pod \"nmstate-metrics-fdff9cb8d-vv8pn\" (UID: \"fef914bd-768f-4cd2-92c1-b5fb49e63ca8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.456416 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q6lv\" (UniqueName: \"kubernetes.io/projected/a194e8ec-fa8a-4efb-af70-ea121bb7d835-kube-api-access-7q6lv\") pod \"nmstate-webhook-6cdbc54649-4kjx7\" (UID: \"a194e8ec-fa8a-4efb-af70-ea121bb7d835\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.494459 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-75cd8bb48b-5cvst"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.495385 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.498696 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.511738 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cd8bb48b-5cvst"] Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.527913 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-oauth-config\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.527982 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-service-ca\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528029 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvrh\" (UniqueName: \"kubernetes.io/projected/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-kube-api-access-hqvrh\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528064 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-config\") pod \"console-75cd8bb48b-5cvst\" 
(UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528096 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fh2b\" (UniqueName: \"kubernetes.io/projected/61c3b605-4097-4e21-af5d-8ad7b4d5585c-kube-api-access-8fh2b\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528125 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv79n\" (UniqueName: \"kubernetes.io/projected/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-kube-api-access-gv79n\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528168 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528224 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-serving-cert\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528416 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528462 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-ovs-socket\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: E1007 19:12:32.528474 4825 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528489 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-trusted-ca-bundle\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528518 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-dbus-socket\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: E1007 19:12:32.528542 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-plugin-serving-cert podName:bf7784c9-07ce-45f5-ad97-788cf3ef3b36 nodeName:}" failed. No retries permitted until 2025-10-07 19:12:33.028516431 +0000 UTC m=+741.850555088 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-dzb8f" (UID: "bf7784c9-07ce-45f5-ad97-788cf3ef3b36") : secret "plugin-serving-cert" not found Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528570 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-oauth-serving-cert\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528612 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-nmstate-lock\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528683 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-ovs-socket\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528726 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-dbus-socket\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.528694 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-nmstate-lock\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.529371 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.540742 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.545006 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv79n\" (UniqueName: \"kubernetes.io/projected/2e774a23-bfdc-466c-92ed-4a9eb8f74c44-kube-api-access-gv79n\") pod \"nmstate-handler-v8dxh\" (UID: \"2e774a23-bfdc-466c-92ed-4a9eb8f74c44\") " pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.558244 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.558707 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvrh\" (UniqueName: \"kubernetes.io/projected/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-kube-api-access-hqvrh\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.630639 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-oauth-serving-cert\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.630689 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-oauth-config\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.630711 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-service-ca\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.630733 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-config\") pod \"console-75cd8bb48b-5cvst\" (UID: 
\"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.630747 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fh2b\" (UniqueName: \"kubernetes.io/projected/61c3b605-4097-4e21-af5d-8ad7b4d5585c-kube-api-access-8fh2b\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.630794 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-serving-cert\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.630826 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-trusted-ca-bundle\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.632087 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-trusted-ca-bundle\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.632747 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-service-ca\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " 
pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.633644 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-oauth-serving-cert\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.633827 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-config\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.636639 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-oauth-config\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.636869 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/61c3b605-4097-4e21-af5d-8ad7b4d5585c-console-serving-cert\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.654495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fh2b\" (UniqueName: \"kubernetes.io/projected/61c3b605-4097-4e21-af5d-8ad7b4d5585c-kube-api-access-8fh2b\") pod \"console-75cd8bb48b-5cvst\" (UID: \"61c3b605-4097-4e21-af5d-8ad7b4d5585c\") " pod="openshift-console/console-75cd8bb48b-5cvst" Oct 
07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.770506 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7"] Oct 07 19:12:32 crc kubenswrapper[4825]: W1007 19:12:32.778686 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda194e8ec_fa8a_4efb_af70_ea121bb7d835.slice/crio-d40fcef8187624888ae3e7a39525d3a690ee58008f7484726bbf36b9904a6b8e WatchSource:0}: Error finding container d40fcef8187624888ae3e7a39525d3a690ee58008f7484726bbf36b9904a6b8e: Status 404 returned error can't find the container with id d40fcef8187624888ae3e7a39525d3a690ee58008f7484726bbf36b9904a6b8e Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.809770 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:32 crc kubenswrapper[4825]: I1007 19:12:32.992954 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cd8bb48b-5cvst"] Oct 07 19:12:32 crc kubenswrapper[4825]: W1007 19:12:32.998440 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61c3b605_4097_4e21_af5d_8ad7b4d5585c.slice/crio-54507c6b13b0ccb8f7ab9dd1e0d527f6b777ce7c1db0e0b6be46cad87c4ce2c9 WatchSource:0}: Error finding container 54507c6b13b0ccb8f7ab9dd1e0d527f6b777ce7c1db0e0b6be46cad87c4ce2c9: Status 404 returned error can't find the container with id 54507c6b13b0ccb8f7ab9dd1e0d527f6b777ce7c1db0e0b6be46cad87c4ce2c9 Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.031167 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn"] Oct 07 19:12:33 crc kubenswrapper[4825]: W1007 19:12:33.032202 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfef914bd_768f_4cd2_92c1_b5fb49e63ca8.slice/crio-567cf9ba7b298ecac302b19996f5561abedd830dbc146ad89e94f06b4ee7caa1 WatchSource:0}: Error finding container 567cf9ba7b298ecac302b19996f5561abedd830dbc146ad89e94f06b4ee7caa1: Status 404 returned error can't find the container with id 567cf9ba7b298ecac302b19996f5561abedd830dbc146ad89e94f06b4ee7caa1 Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.039459 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.044276 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7784c9-07ce-45f5-ad97-788cf3ef3b36-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-dzb8f\" (UID: \"bf7784c9-07ce-45f5-ad97-788cf3ef3b36\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.101165 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v8dxh" event={"ID":"2e774a23-bfdc-466c-92ed-4a9eb8f74c44","Type":"ContainerStarted","Data":"b03945e9c01593b8a8aae4b86c07c50244b01e0bf1a73b3f4043a831efb78475"} Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.102524 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" event={"ID":"fef914bd-768f-4cd2-92c1-b5fb49e63ca8","Type":"ContainerStarted","Data":"567cf9ba7b298ecac302b19996f5561abedd830dbc146ad89e94f06b4ee7caa1"} Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.103422 4825 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-75cd8bb48b-5cvst" event={"ID":"61c3b605-4097-4e21-af5d-8ad7b4d5585c","Type":"ContainerStarted","Data":"54507c6b13b0ccb8f7ab9dd1e0d527f6b777ce7c1db0e0b6be46cad87c4ce2c9"} Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.104420 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" event={"ID":"a194e8ec-fa8a-4efb-af70-ea121bb7d835","Type":"ContainerStarted","Data":"d40fcef8187624888ae3e7a39525d3a690ee58008f7484726bbf36b9904a6b8e"} Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.255559 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" Oct 07 19:12:33 crc kubenswrapper[4825]: I1007 19:12:33.488988 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f"] Oct 07 19:12:33 crc kubenswrapper[4825]: W1007 19:12:33.500495 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf7784c9_07ce_45f5_ad97_788cf3ef3b36.slice/crio-0db7865840cd526e1e0d8dc645e5fdd314cc76aabae10d95bf0406b11567f7fd WatchSource:0}: Error finding container 0db7865840cd526e1e0d8dc645e5fdd314cc76aabae10d95bf0406b11567f7fd: Status 404 returned error can't find the container with id 0db7865840cd526e1e0d8dc645e5fdd314cc76aabae10d95bf0406b11567f7fd Oct 07 19:12:34 crc kubenswrapper[4825]: I1007 19:12:34.115696 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" event={"ID":"bf7784c9-07ce-45f5-ad97-788cf3ef3b36","Type":"ContainerStarted","Data":"0db7865840cd526e1e0d8dc645e5fdd314cc76aabae10d95bf0406b11567f7fd"} Oct 07 19:12:34 crc kubenswrapper[4825]: I1007 19:12:34.118682 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cd8bb48b-5cvst" 
event={"ID":"61c3b605-4097-4e21-af5d-8ad7b4d5585c","Type":"ContainerStarted","Data":"25c098948c8991051230900c9a5adfb2b4b4113f585eb7fabf3a287a40d6d1ef"} Oct 07 19:12:34 crc kubenswrapper[4825]: I1007 19:12:34.142164 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75cd8bb48b-5cvst" podStartSLOduration=2.142129918 podStartE2EDuration="2.142129918s" podCreationTimestamp="2025-10-07 19:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:12:34.141800647 +0000 UTC m=+742.963839324" watchObservedRunningTime="2025-10-07 19:12:34.142129918 +0000 UTC m=+742.964168605" Oct 07 19:12:36 crc kubenswrapper[4825]: I1007 19:12:36.131993 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" event={"ID":"fef914bd-768f-4cd2-92c1-b5fb49e63ca8","Type":"ContainerStarted","Data":"322410a5b4e4aa49fd05f4057105da4a3833727f81717fa053285ff946ec414f"} Oct 07 19:12:36 crc kubenswrapper[4825]: I1007 19:12:36.134855 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" event={"ID":"bf7784c9-07ce-45f5-ad97-788cf3ef3b36","Type":"ContainerStarted","Data":"c89f69b8e25cfcf21cdce7a5bc6acc8cbb10c1acde4c49052b5245748a6f6009"} Oct 07 19:12:36 crc kubenswrapper[4825]: I1007 19:12:36.137959 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" event={"ID":"a194e8ec-fa8a-4efb-af70-ea121bb7d835","Type":"ContainerStarted","Data":"dd20b5790fefa72e5001a63a26c58c60dfa86c9f7b96e6b90d71882592bfd80e"} Oct 07 19:12:36 crc kubenswrapper[4825]: I1007 19:12:36.138251 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:36 crc kubenswrapper[4825]: I1007 19:12:36.156311 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-dzb8f" podStartSLOduration=1.8520112690000001 podStartE2EDuration="4.156282383s" podCreationTimestamp="2025-10-07 19:12:32 +0000 UTC" firstStartedPulling="2025-10-07 19:12:33.502672806 +0000 UTC m=+742.324711453" lastFinishedPulling="2025-10-07 19:12:35.80694393 +0000 UTC m=+744.628982567" observedRunningTime="2025-10-07 19:12:36.153449442 +0000 UTC m=+744.975488129" watchObservedRunningTime="2025-10-07 19:12:36.156282383 +0000 UTC m=+744.978321070" Oct 07 19:12:36 crc kubenswrapper[4825]: I1007 19:12:36.191879 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" podStartSLOduration=1.164158211 podStartE2EDuration="4.191844277s" podCreationTimestamp="2025-10-07 19:12:32 +0000 UTC" firstStartedPulling="2025-10-07 19:12:32.780651229 +0000 UTC m=+741.602689856" lastFinishedPulling="2025-10-07 19:12:35.808337285 +0000 UTC m=+744.630375922" observedRunningTime="2025-10-07 19:12:36.187396553 +0000 UTC m=+745.009435230" watchObservedRunningTime="2025-10-07 19:12:36.191844277 +0000 UTC m=+745.013882964" Oct 07 19:12:37 crc kubenswrapper[4825]: I1007 19:12:37.145633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v8dxh" event={"ID":"2e774a23-bfdc-466c-92ed-4a9eb8f74c44","Type":"ContainerStarted","Data":"c84e156c7f3b9bd2142fb1efd07978a0b1512895147fb646b8feb724d4bbfdd9"} Oct 07 19:12:37 crc kubenswrapper[4825]: I1007 19:12:37.164212 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-v8dxh" podStartSLOduration=1.481818665 podStartE2EDuration="5.164191842s" podCreationTimestamp="2025-10-07 19:12:32 +0000 UTC" firstStartedPulling="2025-10-07 19:12:32.59500596 +0000 UTC m=+741.417044597" lastFinishedPulling="2025-10-07 19:12:36.277379127 +0000 UTC m=+745.099417774" 
observedRunningTime="2025-10-07 19:12:37.162174518 +0000 UTC m=+745.984213165" watchObservedRunningTime="2025-10-07 19:12:37.164191842 +0000 UTC m=+745.986230499" Oct 07 19:12:37 crc kubenswrapper[4825]: I1007 19:12:37.559795 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:39 crc kubenswrapper[4825]: I1007 19:12:39.162554 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" event={"ID":"fef914bd-768f-4cd2-92c1-b5fb49e63ca8","Type":"ContainerStarted","Data":"d008e8f8bdc678b586b89b0922496e8730325d5256442b47fa65ac65c05301f7"} Oct 07 19:12:39 crc kubenswrapper[4825]: I1007 19:12:39.190705 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vv8pn" podStartSLOduration=1.695550517 podStartE2EDuration="7.190671425s" podCreationTimestamp="2025-10-07 19:12:32 +0000 UTC" firstStartedPulling="2025-10-07 19:12:33.034858653 +0000 UTC m=+741.856897290" lastFinishedPulling="2025-10-07 19:12:38.529979521 +0000 UTC m=+747.352018198" observedRunningTime="2025-10-07 19:12:39.186404948 +0000 UTC m=+748.008443645" watchObservedRunningTime="2025-10-07 19:12:39.190671425 +0000 UTC m=+748.012710092" Oct 07 19:12:42 crc kubenswrapper[4825]: I1007 19:12:42.599637 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-v8dxh" Oct 07 19:12:42 crc kubenswrapper[4825]: I1007 19:12:42.810965 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:42 crc kubenswrapper[4825]: I1007 19:12:42.811052 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:42 crc kubenswrapper[4825]: I1007 19:12:42.817888 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:43 crc kubenswrapper[4825]: I1007 19:12:43.195964 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75cd8bb48b-5cvst" Oct 07 19:12:43 crc kubenswrapper[4825]: I1007 19:12:43.281388 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sqfnk"] Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.182954 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5nrv"] Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.183560 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" podUID="08f97853-1190-438d-91b7-f811400b541c" containerName="controller-manager" containerID="cri-o://cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18" gracePeriod=30 Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.277756 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7"] Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.277998 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" podUID="a3b4f007-8217-4308-996e-394b0c3d072c" containerName="route-controller-manager" containerID="cri-o://ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c" gracePeriod=30 Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.564313 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.608523 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.762456 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-client-ca\") pod \"08f97853-1190-438d-91b7-f811400b541c\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.762556 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b4f007-8217-4308-996e-394b0c3d072c-serving-cert\") pod \"a3b4f007-8217-4308-996e-394b0c3d072c\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.762608 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-config\") pod \"08f97853-1190-438d-91b7-f811400b541c\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.762683 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-proxy-ca-bundles\") pod \"08f97853-1190-438d-91b7-f811400b541c\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.762790 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08f97853-1190-438d-91b7-f811400b541c-serving-cert\") pod \"08f97853-1190-438d-91b7-f811400b541c\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.762819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-client-ca\") pod \"a3b4f007-8217-4308-996e-394b0c3d072c\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.762853 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-config\") pod \"a3b4f007-8217-4308-996e-394b0c3d072c\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.763039 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hzwl\" (UniqueName: \"kubernetes.io/projected/08f97853-1190-438d-91b7-f811400b541c-kube-api-access-2hzwl\") pod \"08f97853-1190-438d-91b7-f811400b541c\" (UID: \"08f97853-1190-438d-91b7-f811400b541c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.763085 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4n7\" (UniqueName: \"kubernetes.io/projected/a3b4f007-8217-4308-996e-394b0c3d072c-kube-api-access-2d4n7\") pod \"a3b4f007-8217-4308-996e-394b0c3d072c\" (UID: \"a3b4f007-8217-4308-996e-394b0c3d072c\") " Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.763542 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-client-ca" (OuterVolumeSpecName: "client-ca") pod "08f97853-1190-438d-91b7-f811400b541c" (UID: "08f97853-1190-438d-91b7-f811400b541c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.763771 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "08f97853-1190-438d-91b7-f811400b541c" (UID: "08f97853-1190-438d-91b7-f811400b541c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.764072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-client-ca" (OuterVolumeSpecName: "client-ca") pod "a3b4f007-8217-4308-996e-394b0c3d072c" (UID: "a3b4f007-8217-4308-996e-394b0c3d072c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.763800 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-config" (OuterVolumeSpecName: "config") pod "08f97853-1190-438d-91b7-f811400b541c" (UID: "08f97853-1190-438d-91b7-f811400b541c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.764271 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-config" (OuterVolumeSpecName: "config") pod "a3b4f007-8217-4308-996e-394b0c3d072c" (UID: "a3b4f007-8217-4308-996e-394b0c3d072c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.768402 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b4f007-8217-4308-996e-394b0c3d072c-kube-api-access-2d4n7" (OuterVolumeSpecName: "kube-api-access-2d4n7") pod "a3b4f007-8217-4308-996e-394b0c3d072c" (UID: "a3b4f007-8217-4308-996e-394b0c3d072c"). InnerVolumeSpecName "kube-api-access-2d4n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.768459 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b4f007-8217-4308-996e-394b0c3d072c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a3b4f007-8217-4308-996e-394b0c3d072c" (UID: "a3b4f007-8217-4308-996e-394b0c3d072c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.768793 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f97853-1190-438d-91b7-f811400b541c-kube-api-access-2hzwl" (OuterVolumeSpecName: "kube-api-access-2hzwl") pod "08f97853-1190-438d-91b7-f811400b541c" (UID: "08f97853-1190-438d-91b7-f811400b541c"). InnerVolumeSpecName "kube-api-access-2hzwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.769377 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f97853-1190-438d-91b7-f811400b541c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "08f97853-1190-438d-91b7-f811400b541c" (UID: "08f97853-1190-438d-91b7-f811400b541c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864762 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864810 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b4f007-8217-4308-996e-394b0c3d072c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864822 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864832 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08f97853-1190-438d-91b7-f811400b541c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864848 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08f97853-1190-438d-91b7-f811400b541c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864859 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864881 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b4f007-8217-4308-996e-394b0c3d072c-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864891 4825 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2hzwl\" (UniqueName: \"kubernetes.io/projected/08f97853-1190-438d-91b7-f811400b541c-kube-api-access-2hzwl\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:48 crc kubenswrapper[4825]: I1007 19:12:48.864900 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4n7\" (UniqueName: \"kubernetes.io/projected/a3b4f007-8217-4308-996e-394b0c3d072c-kube-api-access-2d4n7\") on node \"crc\" DevicePath \"\"" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.247835 4825 generic.go:334] "Generic (PLEG): container finished" podID="08f97853-1190-438d-91b7-f811400b541c" containerID="cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18" exitCode=0 Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.247948 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" event={"ID":"08f97853-1190-438d-91b7-f811400b541c","Type":"ContainerDied","Data":"cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18"} Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.248001 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" event={"ID":"08f97853-1190-438d-91b7-f811400b541c","Type":"ContainerDied","Data":"e0cc0fb1db28d9d8b837c660f9ffa3b82ba0f5975c5bce267bb11a0eb850dbca"} Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.248039 4825 scope.go:117] "RemoveContainer" containerID="cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.248274 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5nrv" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.257307 4825 generic.go:334] "Generic (PLEG): container finished" podID="a3b4f007-8217-4308-996e-394b0c3d072c" containerID="ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c" exitCode=0 Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.257393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" event={"ID":"a3b4f007-8217-4308-996e-394b0c3d072c","Type":"ContainerDied","Data":"ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c"} Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.257435 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" event={"ID":"a3b4f007-8217-4308-996e-394b0c3d072c","Type":"ContainerDied","Data":"958bd3482e10c9f530a378be409e1bec4735655ac263a940295c21ac720ff270"} Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.257538 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.279078 4825 scope.go:117] "RemoveContainer" containerID="cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18" Oct 07 19:12:49 crc kubenswrapper[4825]: E1007 19:12:49.279622 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18\": container with ID starting with cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18 not found: ID does not exist" containerID="cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.279693 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18"} err="failed to get container status \"cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18\": rpc error: code = NotFound desc = could not find container \"cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18\": container with ID starting with cf5b0063f5a6f724902fdbbd76997b0f4474bb0875050e0568207467b16f8f18 not found: ID does not exist" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.279720 4825 scope.go:117] "RemoveContainer" containerID="ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.292510 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5nrv"] Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.306705 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5nrv"] Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.306789 4825 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7"] Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.309909 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr8n7"] Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.324490 4825 scope.go:117] "RemoveContainer" containerID="ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c" Oct 07 19:12:49 crc kubenswrapper[4825]: E1007 19:12:49.324894 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c\": container with ID starting with ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c not found: ID does not exist" containerID="ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.324949 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c"} err="failed to get container status \"ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c\": rpc error: code = NotFound desc = could not find container \"ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c\": container with ID starting with ec6e0d33aa1e27ad815a68cbb21b68747cadd47f838177b28e609c72ea2a814c not found: ID does not exist" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.643597 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9"] Oct 07 19:12:49 crc kubenswrapper[4825]: E1007 19:12:49.643935 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f97853-1190-438d-91b7-f811400b541c" containerName="controller-manager" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 
19:12:49.643957 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f97853-1190-438d-91b7-f811400b541c" containerName="controller-manager" Oct 07 19:12:49 crc kubenswrapper[4825]: E1007 19:12:49.643991 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b4f007-8217-4308-996e-394b0c3d072c" containerName="route-controller-manager" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.644004 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b4f007-8217-4308-996e-394b0c3d072c" containerName="route-controller-manager" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.644195 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f97853-1190-438d-91b7-f811400b541c" containerName="controller-manager" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.644224 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b4f007-8217-4308-996e-394b0c3d072c" containerName="route-controller-manager" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.644844 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.648013 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.648401 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.648952 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.649457 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.649811 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.650397 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.695772 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9"] Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.788747 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20c46e03-701c-4c77-b670-7dc3c299f45f-client-ca\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.788839 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hf55\" (UniqueName: \"kubernetes.io/projected/20c46e03-701c-4c77-b670-7dc3c299f45f-kube-api-access-4hf55\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.788867 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c46e03-701c-4c77-b670-7dc3c299f45f-config\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.788885 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20c46e03-701c-4c77-b670-7dc3c299f45f-serving-cert\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.802070 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f97853-1190-438d-91b7-f811400b541c" path="/var/lib/kubelet/pods/08f97853-1190-438d-91b7-f811400b541c/volumes" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.802767 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b4f007-8217-4308-996e-394b0c3d072c" path="/var/lib/kubelet/pods/a3b4f007-8217-4308-996e-394b0c3d072c/volumes" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.889543 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/20c46e03-701c-4c77-b670-7dc3c299f45f-client-ca\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.889607 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hf55\" (UniqueName: \"kubernetes.io/projected/20c46e03-701c-4c77-b670-7dc3c299f45f-kube-api-access-4hf55\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.889628 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c46e03-701c-4c77-b670-7dc3c299f45f-config\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.889653 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20c46e03-701c-4c77-b670-7dc3c299f45f-serving-cert\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.890651 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20c46e03-701c-4c77-b670-7dc3c299f45f-client-ca\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 
19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.890902 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c46e03-701c-4c77-b670-7dc3c299f45f-config\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.899261 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20c46e03-701c-4c77-b670-7dc3c299f45f-serving-cert\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.918036 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hf55\" (UniqueName: \"kubernetes.io/projected/20c46e03-701c-4c77-b670-7dc3c299f45f-kube-api-access-4hf55\") pod \"route-controller-manager-6d75786b4c-gz6k9\" (UID: \"20c46e03-701c-4c77-b670-7dc3c299f45f\") " pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.986323 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79bfcd4589-c6lbv"] Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.987441 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.991186 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.992869 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.993282 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.993435 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.993683 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.993501 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 19:12:49 crc kubenswrapper[4825]: I1007 19:12:49.994217 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.008305 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.013216 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79bfcd4589-c6lbv"] Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.097295 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-proxy-ca-bundles\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.097434 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d8tf\" (UniqueName: \"kubernetes.io/projected/084d14a3-7774-4874-bce1-5d3e8a61e935-kube-api-access-7d8tf\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.097482 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-config\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.097515 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-client-ca\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.097542 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/084d14a3-7774-4874-bce1-5d3e8a61e935-serving-cert\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.198240 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-client-ca\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.198287 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/084d14a3-7774-4874-bce1-5d3e8a61e935-serving-cert\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.198313 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-proxy-ca-bundles\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.198367 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d8tf\" (UniqueName: \"kubernetes.io/projected/084d14a3-7774-4874-bce1-5d3e8a61e935-kube-api-access-7d8tf\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.198388 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-config\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.199781 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-client-ca\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.200887 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-proxy-ca-bundles\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.202277 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/084d14a3-7774-4874-bce1-5d3e8a61e935-serving-cert\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " 
pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.203143 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/084d14a3-7774-4874-bce1-5d3e8a61e935-config\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.217123 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d8tf\" (UniqueName: \"kubernetes.io/projected/084d14a3-7774-4874-bce1-5d3e8a61e935-kube-api-access-7d8tf\") pod \"controller-manager-79bfcd4589-c6lbv\" (UID: \"084d14a3-7774-4874-bce1-5d3e8a61e935\") " pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.225043 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9"] Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.263860 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" event={"ID":"20c46e03-701c-4c77-b670-7dc3c299f45f","Type":"ContainerStarted","Data":"20ba85e424ee34ec09d66dd59ef10bf603829ce6f97f4c461e9584cf2f132635"} Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.312508 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:50 crc kubenswrapper[4825]: I1007 19:12:50.793707 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79bfcd4589-c6lbv"] Oct 07 19:12:51 crc kubenswrapper[4825]: I1007 19:12:51.271897 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" event={"ID":"20c46e03-701c-4c77-b670-7dc3c299f45f","Type":"ContainerStarted","Data":"023d23a1930a300307fe09ddb1b046a606993f31d81b6521f7ec5fe1ab7240ba"} Oct 07 19:12:51 crc kubenswrapper[4825]: I1007 19:12:51.274518 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:51 crc kubenswrapper[4825]: I1007 19:12:51.278154 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" event={"ID":"084d14a3-7774-4874-bce1-5d3e8a61e935","Type":"ContainerStarted","Data":"64eaa7c753108617d68da2c36a4d77ad78c911fc18747593752592e1503863d9"} Oct 07 19:12:51 crc kubenswrapper[4825]: I1007 19:12:51.278199 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" event={"ID":"084d14a3-7774-4874-bce1-5d3e8a61e935","Type":"ContainerStarted","Data":"2ec01ab1cb0c9ad29be38c55bc14a50840d3c15b8e125682984f8d01bf7160e5"} Oct 07 19:12:51 crc kubenswrapper[4825]: I1007 19:12:51.278355 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:51 crc kubenswrapper[4825]: I1007 19:12:51.280116 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" Oct 07 19:12:51 crc kubenswrapper[4825]: 
I1007 19:12:51.292603 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" Oct 07 19:12:51 crc kubenswrapper[4825]: I1007 19:12:51.329251 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d75786b4c-gz6k9" podStartSLOduration=2.329201482 podStartE2EDuration="2.329201482s" podCreationTimestamp="2025-10-07 19:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:12:51.30736068 +0000 UTC m=+760.129399317" watchObservedRunningTime="2025-10-07 19:12:51.329201482 +0000 UTC m=+760.151240119" Oct 07 19:12:52 crc kubenswrapper[4825]: I1007 19:12:52.547137 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4kjx7" Oct 07 19:12:52 crc kubenswrapper[4825]: I1007 19:12:52.564105 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79bfcd4589-c6lbv" podStartSLOduration=4.56408334 podStartE2EDuration="4.56408334s" podCreationTimestamp="2025-10-07 19:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:12:51.376770241 +0000 UTC m=+760.198808878" watchObservedRunningTime="2025-10-07 19:12:52.56408334 +0000 UTC m=+761.386121977" Oct 07 19:12:55 crc kubenswrapper[4825]: I1007 19:12:55.985350 4825 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 19:13:05 crc kubenswrapper[4825]: I1007 19:13:05.708945 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:13:05 crc kubenswrapper[4825]: I1007 19:13:05.709663 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.170809 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t"] Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.172158 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.174699 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.187099 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t"] Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.272303 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpfkd\" (UniqueName: \"kubernetes.io/projected/e49fd630-5fe7-4b4a-a455-9f53417191bf-kube-api-access-zpfkd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.272378 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.272413 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.373685 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpfkd\" (UniqueName: \"kubernetes.io/projected/e49fd630-5fe7-4b4a-a455-9f53417191bf-kube-api-access-zpfkd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.373766 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.373797 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-util\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.374386 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.374312 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.405758 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpfkd\" (UniqueName: \"kubernetes.io/projected/e49fd630-5fe7-4b4a-a455-9f53417191bf-kube-api-access-zpfkd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.488996 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:07 crc kubenswrapper[4825]: I1007 19:13:07.940224 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t"] Oct 07 19:13:07 crc kubenswrapper[4825]: W1007 19:13:07.948095 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49fd630_5fe7_4b4a_a455_9f53417191bf.slice/crio-8ed88bb238e0f1c9d2e498fb8610cb8e2d1d6f3eda1d08fa7f2c4380360293d0 WatchSource:0}: Error finding container 8ed88bb238e0f1c9d2e498fb8610cb8e2d1d6f3eda1d08fa7f2c4380360293d0: Status 404 returned error can't find the container with id 8ed88bb238e0f1c9d2e498fb8610cb8e2d1d6f3eda1d08fa7f2c4380360293d0 Oct 07 19:13:08 crc kubenswrapper[4825]: I1007 19:13:08.352419 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-sqfnk" podUID="21bd5368-2631-4c6c-94cf-d6e64b1dd657" containerName="console" containerID="cri-o://9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1" gracePeriod=15 Oct 07 19:13:08 crc kubenswrapper[4825]: I1007 19:13:08.399373 4825 generic.go:334] "Generic (PLEG): container finished" podID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerID="339e9711596513d103cb1ac0e6bb18f526b2918255f7e50e9f2bac058bfccc21" exitCode=0 Oct 07 19:13:08 crc kubenswrapper[4825]: I1007 19:13:08.399419 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" event={"ID":"e49fd630-5fe7-4b4a-a455-9f53417191bf","Type":"ContainerDied","Data":"339e9711596513d103cb1ac0e6bb18f526b2918255f7e50e9f2bac058bfccc21"} Oct 07 19:13:08 crc kubenswrapper[4825]: I1007 19:13:08.399449 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" event={"ID":"e49fd630-5fe7-4b4a-a455-9f53417191bf","Type":"ContainerStarted","Data":"8ed88bb238e0f1c9d2e498fb8610cb8e2d1d6f3eda1d08fa7f2c4380360293d0"} Oct 07 19:13:08 crc kubenswrapper[4825]: I1007 19:13:08.819506 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sqfnk_21bd5368-2631-4c6c-94cf-d6e64b1dd657/console/0.log" Oct 07 19:13:08 crc kubenswrapper[4825]: I1007 19:13:08.819599 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.004517 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-oauth-serving-cert\") pod \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.004632 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-service-ca\") pod \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.004734 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-serving-cert\") pod \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.004777 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-config\") pod 
\"21bd5368-2631-4c6c-94cf-d6e64b1dd657\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.004831 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-oauth-config\") pod \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.004894 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvljb\" (UniqueName: \"kubernetes.io/projected/21bd5368-2631-4c6c-94cf-d6e64b1dd657-kube-api-access-fvljb\") pod \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.004952 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-trusted-ca-bundle\") pod \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\" (UID: \"21bd5368-2631-4c6c-94cf-d6e64b1dd657\") " Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.005985 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-service-ca" (OuterVolumeSpecName: "service-ca") pod "21bd5368-2631-4c6c-94cf-d6e64b1dd657" (UID: "21bd5368-2631-4c6c-94cf-d6e64b1dd657"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.006053 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "21bd5368-2631-4c6c-94cf-d6e64b1dd657" (UID: "21bd5368-2631-4c6c-94cf-d6e64b1dd657"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.006153 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "21bd5368-2631-4c6c-94cf-d6e64b1dd657" (UID: "21bd5368-2631-4c6c-94cf-d6e64b1dd657"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.006667 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-config" (OuterVolumeSpecName: "console-config") pod "21bd5368-2631-4c6c-94cf-d6e64b1dd657" (UID: "21bd5368-2631-4c6c-94cf-d6e64b1dd657"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.013584 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "21bd5368-2631-4c6c-94cf-d6e64b1dd657" (UID: "21bd5368-2631-4c6c-94cf-d6e64b1dd657"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.014177 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "21bd5368-2631-4c6c-94cf-d6e64b1dd657" (UID: "21bd5368-2631-4c6c-94cf-d6e64b1dd657"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.014375 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21bd5368-2631-4c6c-94cf-d6e64b1dd657-kube-api-access-fvljb" (OuterVolumeSpecName: "kube-api-access-fvljb") pod "21bd5368-2631-4c6c-94cf-d6e64b1dd657" (UID: "21bd5368-2631-4c6c-94cf-d6e64b1dd657"). InnerVolumeSpecName "kube-api-access-fvljb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.107114 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.107187 4825 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.107218 4825 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.107289 4825 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21bd5368-2631-4c6c-94cf-d6e64b1dd657-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.107317 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvljb\" (UniqueName: \"kubernetes.io/projected/21bd5368-2631-4c6c-94cf-d6e64b1dd657-kube-api-access-fvljb\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.107336 4825 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.107354 4825 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21bd5368-2631-4c6c-94cf-d6e64b1dd657-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.409293 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sqfnk_21bd5368-2631-4c6c-94cf-d6e64b1dd657/console/0.log" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.409380 4825 generic.go:334] "Generic (PLEG): container finished" podID="21bd5368-2631-4c6c-94cf-d6e64b1dd657" containerID="9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1" exitCode=2 Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.409435 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqfnk" event={"ID":"21bd5368-2631-4c6c-94cf-d6e64b1dd657","Type":"ContainerDied","Data":"9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1"} Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.409474 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sqfnk" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.409494 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqfnk" event={"ID":"21bd5368-2631-4c6c-94cf-d6e64b1dd657","Type":"ContainerDied","Data":"e7f4a4e2020b010b2adfee9ce29f148bb32ef260d9b150fb02524dd668d07e56"} Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.409526 4825 scope.go:117] "RemoveContainer" containerID="9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.430170 4825 scope.go:117] "RemoveContainer" containerID="9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1" Oct 07 19:13:09 crc kubenswrapper[4825]: E1007 19:13:09.430691 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1\": container with ID starting with 9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1 not found: ID does not exist" containerID="9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.430729 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1"} err="failed to get container status \"9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1\": rpc error: code = NotFound desc = could not find container \"9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1\": container with ID starting with 9388e160b342b073ab982191a1c3af015f46c42aa95aa44ebe480d51c73ebeb1 not found: ID does not exist" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.446576 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sqfnk"] Oct 07 19:13:09 crc 
kubenswrapper[4825]: I1007 19:13:09.451274 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-sqfnk"] Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.471263 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7txs2"] Oct 07 19:13:09 crc kubenswrapper[4825]: E1007 19:13:09.471508 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bd5368-2631-4c6c-94cf-d6e64b1dd657" containerName="console" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.471527 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bd5368-2631-4c6c-94cf-d6e64b1dd657" containerName="console" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.471642 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="21bd5368-2631-4c6c-94cf-d6e64b1dd657" containerName="console" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.474029 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.490220 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7txs2"] Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.614947 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-catalog-content\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.615123 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-utilities\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.615213 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd5h8\" (UniqueName: \"kubernetes.io/projected/8230ee67-80cd-4dbc-b1af-383c43bcd30c-kube-api-access-xd5h8\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.716561 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-utilities\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.716874 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xd5h8\" (UniqueName: \"kubernetes.io/projected/8230ee67-80cd-4dbc-b1af-383c43bcd30c-kube-api-access-xd5h8\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.716916 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-catalog-content\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.717269 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-utilities\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.717305 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-catalog-content\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.744418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd5h8\" (UniqueName: \"kubernetes.io/projected/8230ee67-80cd-4dbc-b1af-383c43bcd30c-kube-api-access-xd5h8\") pod \"redhat-operators-7txs2\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.803720 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:09 crc kubenswrapper[4825]: I1007 19:13:09.808691 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21bd5368-2631-4c6c-94cf-d6e64b1dd657" path="/var/lib/kubelet/pods/21bd5368-2631-4c6c-94cf-d6e64b1dd657/volumes" Oct 07 19:13:10 crc kubenswrapper[4825]: I1007 19:13:10.220563 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7txs2"] Oct 07 19:13:10 crc kubenswrapper[4825]: I1007 19:13:10.416815 4825 generic.go:334] "Generic (PLEG): container finished" podID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerID="e0823c87699e21b52b2813070cfecb9a42a8d870e558ac774516720b34280cac" exitCode=0 Oct 07 19:13:10 crc kubenswrapper[4825]: I1007 19:13:10.416872 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" event={"ID":"e49fd630-5fe7-4b4a-a455-9f53417191bf","Type":"ContainerDied","Data":"e0823c87699e21b52b2813070cfecb9a42a8d870e558ac774516720b34280cac"} Oct 07 19:13:10 crc kubenswrapper[4825]: I1007 19:13:10.421653 4825 generic.go:334] "Generic (PLEG): container finished" podID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerID="3a879ddd18214f7e263e319b8d481fc81330592b38fa10d747d83c0ea0193641" exitCode=0 Oct 07 19:13:10 crc kubenswrapper[4825]: I1007 19:13:10.421677 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7txs2" event={"ID":"8230ee67-80cd-4dbc-b1af-383c43bcd30c","Type":"ContainerDied","Data":"3a879ddd18214f7e263e319b8d481fc81330592b38fa10d747d83c0ea0193641"} Oct 07 19:13:10 crc kubenswrapper[4825]: I1007 19:13:10.421690 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7txs2" 
event={"ID":"8230ee67-80cd-4dbc-b1af-383c43bcd30c","Type":"ContainerStarted","Data":"bbe808d8dad9ebda44cdd9fe5a5d7f2b206eefc0df8c6917b4faa4fb87ce9999"} Oct 07 19:13:11 crc kubenswrapper[4825]: I1007 19:13:11.430782 4825 generic.go:334] "Generic (PLEG): container finished" podID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerID="062cd820d4194ba1e07b3ac3319c89a239b5dfe9a86a5f19a2d75c02491afc9a" exitCode=0 Oct 07 19:13:11 crc kubenswrapper[4825]: I1007 19:13:11.430874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" event={"ID":"e49fd630-5fe7-4b4a-a455-9f53417191bf","Type":"ContainerDied","Data":"062cd820d4194ba1e07b3ac3319c89a239b5dfe9a86a5f19a2d75c02491afc9a"} Oct 07 19:13:11 crc kubenswrapper[4825]: I1007 19:13:11.435460 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7txs2" event={"ID":"8230ee67-80cd-4dbc-b1af-383c43bcd30c","Type":"ContainerStarted","Data":"bf9bcb52dd6266f452d4d33f197f12ceb40e70523696ae9389a697fb4a81654d"} Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.445123 4825 generic.go:334] "Generic (PLEG): container finished" podID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerID="bf9bcb52dd6266f452d4d33f197f12ceb40e70523696ae9389a697fb4a81654d" exitCode=0 Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.445474 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7txs2" event={"ID":"8230ee67-80cd-4dbc-b1af-383c43bcd30c","Type":"ContainerDied","Data":"bf9bcb52dd6266f452d4d33f197f12ceb40e70523696ae9389a697fb4a81654d"} Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.828703 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.964950 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-util\") pod \"e49fd630-5fe7-4b4a-a455-9f53417191bf\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.965039 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpfkd\" (UniqueName: \"kubernetes.io/projected/e49fd630-5fe7-4b4a-a455-9f53417191bf-kube-api-access-zpfkd\") pod \"e49fd630-5fe7-4b4a-a455-9f53417191bf\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.965084 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-bundle\") pod \"e49fd630-5fe7-4b4a-a455-9f53417191bf\" (UID: \"e49fd630-5fe7-4b4a-a455-9f53417191bf\") " Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.966789 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-bundle" (OuterVolumeSpecName: "bundle") pod "e49fd630-5fe7-4b4a-a455-9f53417191bf" (UID: "e49fd630-5fe7-4b4a-a455-9f53417191bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.971072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49fd630-5fe7-4b4a-a455-9f53417191bf-kube-api-access-zpfkd" (OuterVolumeSpecName: "kube-api-access-zpfkd") pod "e49fd630-5fe7-4b4a-a455-9f53417191bf" (UID: "e49fd630-5fe7-4b4a-a455-9f53417191bf"). InnerVolumeSpecName "kube-api-access-zpfkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:13:12 crc kubenswrapper[4825]: I1007 19:13:12.977160 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-util" (OuterVolumeSpecName: "util") pod "e49fd630-5fe7-4b4a-a455-9f53417191bf" (UID: "e49fd630-5fe7-4b4a-a455-9f53417191bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.066521 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpfkd\" (UniqueName: \"kubernetes.io/projected/e49fd630-5fe7-4b4a-a455-9f53417191bf-kube-api-access-zpfkd\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.066842 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.066854 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e49fd630-5fe7-4b4a-a455-9f53417191bf-util\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.456715 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.456722 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t" event={"ID":"e49fd630-5fe7-4b4a-a455-9f53417191bf","Type":"ContainerDied","Data":"8ed88bb238e0f1c9d2e498fb8610cb8e2d1d6f3eda1d08fa7f2c4380360293d0"} Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.456803 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ed88bb238e0f1c9d2e498fb8610cb8e2d1d6f3eda1d08fa7f2c4380360293d0" Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.459840 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7txs2" event={"ID":"8230ee67-80cd-4dbc-b1af-383c43bcd30c","Type":"ContainerStarted","Data":"257e325a88c35406dce611c6985b3c7718bbd7ae74780b9790219bb67da5921e"} Oct 07 19:13:13 crc kubenswrapper[4825]: I1007 19:13:13.484967 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7txs2" podStartSLOduration=1.935935623 podStartE2EDuration="4.48494947s" podCreationTimestamp="2025-10-07 19:13:09 +0000 UTC" firstStartedPulling="2025-10-07 19:13:10.423189384 +0000 UTC m=+779.245228031" lastFinishedPulling="2025-10-07 19:13:12.972203201 +0000 UTC m=+781.794241878" observedRunningTime="2025-10-07 19:13:13.48121723 +0000 UTC m=+782.303255907" watchObservedRunningTime="2025-10-07 19:13:13.48494947 +0000 UTC m=+782.306988107" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.077504 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xtjct"] Oct 07 19:13:15 crc kubenswrapper[4825]: E1007 19:13:15.077810 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49fd630-5fe7-4b4a-a455-9f53417191bf" 
containerName="util" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.077827 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerName="util" Oct 07 19:13:15 crc kubenswrapper[4825]: E1007 19:13:15.077838 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerName="pull" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.077844 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerName="pull" Oct 07 19:13:15 crc kubenswrapper[4825]: E1007 19:13:15.077858 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerName="extract" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.077864 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerName="extract" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.077974 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49fd630-5fe7-4b4a-a455-9f53417191bf" containerName="extract" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.078936 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.097083 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xtjct"] Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.214537 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-utilities\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.214603 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-catalog-content\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.214660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cwz2\" (UniqueName: \"kubernetes.io/projected/989bf63e-9986-4d13-b374-a3188212cbcd-kube-api-access-4cwz2\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.316952 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-catalog-content\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.317060 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4cwz2\" (UniqueName: \"kubernetes.io/projected/989bf63e-9986-4d13-b374-a3188212cbcd-kube-api-access-4cwz2\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.317122 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-utilities\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.317600 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-catalog-content\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.317719 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-utilities\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.341351 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cwz2\" (UniqueName: \"kubernetes.io/projected/989bf63e-9986-4d13-b374-a3188212cbcd-kube-api-access-4cwz2\") pod \"certified-operators-xtjct\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.411331 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:15 crc kubenswrapper[4825]: I1007 19:13:15.842798 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xtjct"] Oct 07 19:13:15 crc kubenswrapper[4825]: W1007 19:13:15.848390 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod989bf63e_9986_4d13_b374_a3188212cbcd.slice/crio-0af64068b8bb731b38fb28a1ab3deed28763332a470663f7630641998e6aab8e WatchSource:0}: Error finding container 0af64068b8bb731b38fb28a1ab3deed28763332a470663f7630641998e6aab8e: Status 404 returned error can't find the container with id 0af64068b8bb731b38fb28a1ab3deed28763332a470663f7630641998e6aab8e Oct 07 19:13:16 crc kubenswrapper[4825]: I1007 19:13:16.483753 4825 generic.go:334] "Generic (PLEG): container finished" podID="989bf63e-9986-4d13-b374-a3188212cbcd" containerID="5c2e9311f81cceaa4dbadafe31c3b303f1f3036b619a432fc356bf5cd392bfac" exitCode=0 Oct 07 19:13:16 crc kubenswrapper[4825]: I1007 19:13:16.483816 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtjct" event={"ID":"989bf63e-9986-4d13-b374-a3188212cbcd","Type":"ContainerDied","Data":"5c2e9311f81cceaa4dbadafe31c3b303f1f3036b619a432fc356bf5cd392bfac"} Oct 07 19:13:16 crc kubenswrapper[4825]: I1007 19:13:16.483846 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtjct" event={"ID":"989bf63e-9986-4d13-b374-a3188212cbcd","Type":"ContainerStarted","Data":"0af64068b8bb731b38fb28a1ab3deed28763332a470663f7630641998e6aab8e"} Oct 07 19:13:18 crc kubenswrapper[4825]: I1007 19:13:18.495881 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtjct" 
event={"ID":"989bf63e-9986-4d13-b374-a3188212cbcd","Type":"ContainerStarted","Data":"c86f5d2c81ee45c01243d0b15b00716c33ec37af1df501b6e2b2c3c656963b77"} Oct 07 19:13:19 crc kubenswrapper[4825]: I1007 19:13:19.804021 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:19 crc kubenswrapper[4825]: I1007 19:13:19.804423 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:19 crc kubenswrapper[4825]: I1007 19:13:19.862668 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.479685 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8fdc2"] Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.481328 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.490336 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fdc2"] Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.507536 4825 generic.go:334] "Generic (PLEG): container finished" podID="989bf63e-9986-4d13-b374-a3188212cbcd" containerID="c86f5d2c81ee45c01243d0b15b00716c33ec37af1df501b6e2b2c3c656963b77" exitCode=0 Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.507606 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtjct" event={"ID":"989bf63e-9986-4d13-b374-a3188212cbcd","Type":"ContainerDied","Data":"c86f5d2c81ee45c01243d0b15b00716c33ec37af1df501b6e2b2c3c656963b77"} Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.555792 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.587855 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-catalog-content\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.588172 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-utilities\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.588345 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqzx\" (UniqueName: \"kubernetes.io/projected/e43e163f-78f4-48a7-b53a-a762f06322c1-kube-api-access-xlqzx\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.689129 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqzx\" (UniqueName: \"kubernetes.io/projected/e43e163f-78f4-48a7-b53a-a762f06322c1-kube-api-access-xlqzx\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.689190 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-catalog-content\") pod \"redhat-marketplace-8fdc2\" (UID: 
\"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.689244 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-utilities\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.689673 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-utilities\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.689767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-catalog-content\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.708205 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqzx\" (UniqueName: \"kubernetes.io/projected/e43e163f-78f4-48a7-b53a-a762f06322c1-kube-api-access-xlqzx\") pod \"redhat-marketplace-8fdc2\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:20 crc kubenswrapper[4825]: I1007 19:13:20.842780 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:21 crc kubenswrapper[4825]: I1007 19:13:21.307651 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fdc2"] Oct 07 19:13:21 crc kubenswrapper[4825]: W1007 19:13:21.319367 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode43e163f_78f4_48a7_b53a_a762f06322c1.slice/crio-72215bd639a09c32dd48c0e686b997124945167ae08092affadb420f259c3bfe WatchSource:0}: Error finding container 72215bd639a09c32dd48c0e686b997124945167ae08092affadb420f259c3bfe: Status 404 returned error can't find the container with id 72215bd639a09c32dd48c0e686b997124945167ae08092affadb420f259c3bfe Oct 07 19:13:21 crc kubenswrapper[4825]: I1007 19:13:21.514515 4825 generic.go:334] "Generic (PLEG): container finished" podID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerID="e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276" exitCode=0 Oct 07 19:13:21 crc kubenswrapper[4825]: I1007 19:13:21.514583 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fdc2" event={"ID":"e43e163f-78f4-48a7-b53a-a762f06322c1","Type":"ContainerDied","Data":"e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276"} Oct 07 19:13:21 crc kubenswrapper[4825]: I1007 19:13:21.514837 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fdc2" event={"ID":"e43e163f-78f4-48a7-b53a-a762f06322c1","Type":"ContainerStarted","Data":"72215bd639a09c32dd48c0e686b997124945167ae08092affadb420f259c3bfe"} Oct 07 19:13:22 crc kubenswrapper[4825]: I1007 19:13:22.522819 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fdc2" 
event={"ID":"e43e163f-78f4-48a7-b53a-a762f06322c1","Type":"ContainerStarted","Data":"b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87"} Oct 07 19:13:22 crc kubenswrapper[4825]: I1007 19:13:22.524893 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtjct" event={"ID":"989bf63e-9986-4d13-b374-a3188212cbcd","Type":"ContainerStarted","Data":"772c7a9774cbcabc45f4003680615045c503d5590f14cfb940ff3d2c92a67264"} Oct 07 19:13:22 crc kubenswrapper[4825]: I1007 19:13:22.560041 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xtjct" podStartSLOduration=2.512290835 podStartE2EDuration="7.560018673s" podCreationTimestamp="2025-10-07 19:13:15 +0000 UTC" firstStartedPulling="2025-10-07 19:13:16.485337721 +0000 UTC m=+785.307376358" lastFinishedPulling="2025-10-07 19:13:21.533065559 +0000 UTC m=+790.355104196" observedRunningTime="2025-10-07 19:13:22.553894806 +0000 UTC m=+791.375933443" watchObservedRunningTime="2025-10-07 19:13:22.560018673 +0000 UTC m=+791.382057310" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.532013 4825 generic.go:334] "Generic (PLEG): container finished" podID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerID="b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87" exitCode=0 Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.532055 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fdc2" event={"ID":"e43e163f-78f4-48a7-b53a-a762f06322c1","Type":"ContainerDied","Data":"b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87"} Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.545455 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m"] Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.546102 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.551833 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.552189 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.552265 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.552314 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.552895 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5clc8" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.564824 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m"] Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.629909 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9rh4\" (UniqueName: \"kubernetes.io/projected/90e787f1-6fb5-4827-b024-89aeb27ca750-kube-api-access-h9rh4\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.630085 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/90e787f1-6fb5-4827-b024-89aeb27ca750-webhook-cert\") pod 
\"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.630154 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/90e787f1-6fb5-4827-b024-89aeb27ca750-apiservice-cert\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.731494 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/90e787f1-6fb5-4827-b024-89aeb27ca750-apiservice-cert\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.731570 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9rh4\" (UniqueName: \"kubernetes.io/projected/90e787f1-6fb5-4827-b024-89aeb27ca750-kube-api-access-h9rh4\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.731633 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/90e787f1-6fb5-4827-b024-89aeb27ca750-webhook-cert\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc 
kubenswrapper[4825]: I1007 19:13:23.746199 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/90e787f1-6fb5-4827-b024-89aeb27ca750-webhook-cert\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.751965 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/90e787f1-6fb5-4827-b024-89aeb27ca750-apiservice-cert\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.757538 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9rh4\" (UniqueName: \"kubernetes.io/projected/90e787f1-6fb5-4827-b024-89aeb27ca750-kube-api-access-h9rh4\") pod \"metallb-operator-controller-manager-bb67dff7d-fcd7m\" (UID: \"90e787f1-6fb5-4827-b024-89aeb27ca750\") " pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.803201 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4"] Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.804271 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.806654 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.807942 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.812294 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9vhwj" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.821090 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4"] Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.832517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51adc395-c4fb-43b7-a152-871a4b65a832-webhook-cert\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.832573 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7nf4\" (UniqueName: \"kubernetes.io/projected/51adc395-c4fb-43b7-a152-871a4b65a832-kube-api-access-k7nf4\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.832599 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/51adc395-c4fb-43b7-a152-871a4b65a832-apiservice-cert\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.860563 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.933759 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51adc395-c4fb-43b7-a152-871a4b65a832-webhook-cert\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.933822 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7nf4\" (UniqueName: \"kubernetes.io/projected/51adc395-c4fb-43b7-a152-871a4b65a832-kube-api-access-k7nf4\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.933843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/51adc395-c4fb-43b7-a152-871a4b65a832-apiservice-cert\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.938843 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/51adc395-c4fb-43b7-a152-871a4b65a832-apiservice-cert\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.957255 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51adc395-c4fb-43b7-a152-871a4b65a832-webhook-cert\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:23 crc kubenswrapper[4825]: I1007 19:13:23.970087 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7nf4\" (UniqueName: \"kubernetes.io/projected/51adc395-c4fb-43b7-a152-871a4b65a832-kube-api-access-k7nf4\") pod \"metallb-operator-webhook-server-7c9df698c8-5bgs4\" (UID: \"51adc395-c4fb-43b7-a152-871a4b65a832\") " pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.120629 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.262407 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7txs2"] Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.262643 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7txs2" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="registry-server" containerID="cri-o://257e325a88c35406dce611c6985b3c7718bbd7ae74780b9790219bb67da5921e" gracePeriod=2 Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.330319 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m"] Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.377272 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4"] Oct 07 19:13:24 crc kubenswrapper[4825]: W1007 19:13:24.416450 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51adc395_c4fb_43b7_a152_871a4b65a832.slice/crio-3a36cf410eaf53dde62eb794882ac979b43a9d369a904689a2567bbe6089bd82 WatchSource:0}: Error finding container 3a36cf410eaf53dde62eb794882ac979b43a9d369a904689a2567bbe6089bd82: Status 404 returned error can't find the container with id 3a36cf410eaf53dde62eb794882ac979b43a9d369a904689a2567bbe6089bd82 Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.537840 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" event={"ID":"90e787f1-6fb5-4827-b024-89aeb27ca750","Type":"ContainerStarted","Data":"9cb99af6579518d1636658105828525047c46411e2e7501f81b70a2bb56a0604"} Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.540533 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fdc2" event={"ID":"e43e163f-78f4-48a7-b53a-a762f06322c1","Type":"ContainerStarted","Data":"0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b"} Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.543965 4825 generic.go:334] "Generic (PLEG): container finished" podID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerID="257e325a88c35406dce611c6985b3c7718bbd7ae74780b9790219bb67da5921e" exitCode=0 Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.544021 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7txs2" event={"ID":"8230ee67-80cd-4dbc-b1af-383c43bcd30c","Type":"ContainerDied","Data":"257e325a88c35406dce611c6985b3c7718bbd7ae74780b9790219bb67da5921e"} Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.545968 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" event={"ID":"51adc395-c4fb-43b7-a152-871a4b65a832","Type":"ContainerStarted","Data":"3a36cf410eaf53dde62eb794882ac979b43a9d369a904689a2567bbe6089bd82"} Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.560427 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8fdc2" podStartSLOduration=2.121453954 podStartE2EDuration="4.560412245s" podCreationTimestamp="2025-10-07 19:13:20 +0000 UTC" firstStartedPulling="2025-10-07 19:13:21.531392776 +0000 UTC m=+790.353431413" lastFinishedPulling="2025-10-07 19:13:23.970351067 +0000 UTC m=+792.792389704" observedRunningTime="2025-10-07 19:13:24.556953864 +0000 UTC m=+793.378992501" watchObservedRunningTime="2025-10-07 19:13:24.560412245 +0000 UTC m=+793.382450882" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.638337 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.741726 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-catalog-content\") pod \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.741784 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd5h8\" (UniqueName: \"kubernetes.io/projected/8230ee67-80cd-4dbc-b1af-383c43bcd30c-kube-api-access-xd5h8\") pod \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.741814 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-utilities\") pod \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\" (UID: \"8230ee67-80cd-4dbc-b1af-383c43bcd30c\") " Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.742591 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-utilities" (OuterVolumeSpecName: "utilities") pod "8230ee67-80cd-4dbc-b1af-383c43bcd30c" (UID: "8230ee67-80cd-4dbc-b1af-383c43bcd30c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.746472 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8230ee67-80cd-4dbc-b1af-383c43bcd30c-kube-api-access-xd5h8" (OuterVolumeSpecName: "kube-api-access-xd5h8") pod "8230ee67-80cd-4dbc-b1af-383c43bcd30c" (UID: "8230ee67-80cd-4dbc-b1af-383c43bcd30c"). InnerVolumeSpecName "kube-api-access-xd5h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.832294 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8230ee67-80cd-4dbc-b1af-383c43bcd30c" (UID: "8230ee67-80cd-4dbc-b1af-383c43bcd30c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.843255 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.843293 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd5h8\" (UniqueName: \"kubernetes.io/projected/8230ee67-80cd-4dbc-b1af-383c43bcd30c-kube-api-access-xd5h8\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:24 crc kubenswrapper[4825]: I1007 19:13:24.843306 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8230ee67-80cd-4dbc-b1af-383c43bcd30c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.412118 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.412215 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.476378 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.555768 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7txs2" event={"ID":"8230ee67-80cd-4dbc-b1af-383c43bcd30c","Type":"ContainerDied","Data":"bbe808d8dad9ebda44cdd9fe5a5d7f2b206eefc0df8c6917b4faa4fb87ce9999"} Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.555856 4825 scope.go:117] "RemoveContainer" containerID="257e325a88c35406dce611c6985b3c7718bbd7ae74780b9790219bb67da5921e" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.556009 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7txs2" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.584997 4825 scope.go:117] "RemoveContainer" containerID="bf9bcb52dd6266f452d4d33f197f12ceb40e70523696ae9389a697fb4a81654d" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.601749 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7txs2"] Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.605561 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7txs2"] Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.625221 4825 scope.go:117] "RemoveContainer" containerID="3a879ddd18214f7e263e319b8d481fc81330592b38fa10d747d83c0ea0193641" Oct 07 19:13:25 crc kubenswrapper[4825]: I1007 19:13:25.804620 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" path="/var/lib/kubelet/pods/8230ee67-80cd-4dbc-b1af-383c43bcd30c/volumes" Oct 07 19:13:30 crc kubenswrapper[4825]: I1007 19:13:30.844068 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:30 crc kubenswrapper[4825]: I1007 19:13:30.844380 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:30 crc kubenswrapper[4825]: I1007 19:13:30.913162 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:31 crc kubenswrapper[4825]: I1007 19:13:31.589362 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" event={"ID":"51adc395-c4fb-43b7-a152-871a4b65a832","Type":"ContainerStarted","Data":"867a1e393fbb7a94752e3c85aec61f79ef66c9f738e20b927a5b28673aa19967"} Oct 07 19:13:31 crc kubenswrapper[4825]: I1007 19:13:31.589629 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:31 crc kubenswrapper[4825]: I1007 19:13:31.590894 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" event={"ID":"90e787f1-6fb5-4827-b024-89aeb27ca750","Type":"ContainerStarted","Data":"8639e2452107b05dfd96d636fc06338b02cb1c04e39b82b3347a0a324d9088b2"} Oct 07 19:13:31 crc kubenswrapper[4825]: I1007 19:13:31.591000 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:13:31 crc kubenswrapper[4825]: I1007 19:13:31.606652 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" podStartSLOduration=1.928310999 podStartE2EDuration="8.606632004s" podCreationTimestamp="2025-10-07 19:13:23 +0000 UTC" firstStartedPulling="2025-10-07 19:13:24.419377963 +0000 UTC m=+793.241416600" lastFinishedPulling="2025-10-07 19:13:31.097698968 +0000 UTC m=+799.919737605" observedRunningTime="2025-10-07 19:13:31.605729245 +0000 UTC m=+800.427767882" watchObservedRunningTime="2025-10-07 19:13:31.606632004 +0000 UTC m=+800.428670651" Oct 07 19:13:31 crc kubenswrapper[4825]: I1007 19:13:31.647255 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:31 crc kubenswrapper[4825]: I1007 19:13:31.669444 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" podStartSLOduration=1.9317568600000001 podStartE2EDuration="8.669419877s" podCreationTimestamp="2025-10-07 19:13:23 +0000 UTC" firstStartedPulling="2025-10-07 19:13:24.342259291 +0000 UTC m=+793.164297928" lastFinishedPulling="2025-10-07 19:13:31.079922268 +0000 UTC m=+799.901960945" observedRunningTime="2025-10-07 19:13:31.624382543 +0000 UTC m=+800.446421170" watchObservedRunningTime="2025-10-07 19:13:31.669419877 +0000 UTC m=+800.491458524" Oct 07 19:13:33 crc kubenswrapper[4825]: I1007 19:13:33.882043 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2nn66"] Oct 07 19:13:33 crc kubenswrapper[4825]: E1007 19:13:33.882317 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="extract-content" Oct 07 19:13:33 crc kubenswrapper[4825]: I1007 19:13:33.882332 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="extract-content" Oct 07 19:13:33 crc kubenswrapper[4825]: E1007 19:13:33.882344 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="registry-server" Oct 07 19:13:33 crc kubenswrapper[4825]: I1007 19:13:33.882353 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="registry-server" Oct 07 19:13:33 crc kubenswrapper[4825]: E1007 19:13:33.882371 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="extract-utilities" Oct 07 19:13:33 crc kubenswrapper[4825]: I1007 19:13:33.882379 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="extract-utilities" Oct 07 19:13:33 crc kubenswrapper[4825]: I1007 19:13:33.882513 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8230ee67-80cd-4dbc-b1af-383c43bcd30c" containerName="registry-server" Oct 07 19:13:33 crc kubenswrapper[4825]: I1007 19:13:33.883443 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:33 crc kubenswrapper[4825]: I1007 19:13:33.895495 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nn66"] Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.080106 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-catalog-content\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.080186 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-utilities\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.080270 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djnft\" (UniqueName: \"kubernetes.io/projected/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-kube-api-access-djnft\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.181328 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-utilities\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.181598 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djnft\" (UniqueName: \"kubernetes.io/projected/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-kube-api-access-djnft\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.181643 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-catalog-content\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.181814 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-utilities\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.182007 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-catalog-content\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.201182 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-djnft\" (UniqueName: \"kubernetes.io/projected/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-kube-api-access-djnft\") pod \"community-operators-2nn66\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.204812 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.443580 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nn66"] Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.618686 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerStarted","Data":"f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691"} Oct 07 19:13:34 crc kubenswrapper[4825]: I1007 19:13:34.618981 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerStarted","Data":"15b01a464d91acb0c41ceef0d4c54cb0b7bf95d87c2b9aefad0aa0cf27e4220d"} Oct 07 19:13:35 crc kubenswrapper[4825]: I1007 19:13:35.472022 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:35 crc kubenswrapper[4825]: I1007 19:13:35.627854 4825 generic.go:334] "Generic (PLEG): container finished" podID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerID="f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691" exitCode=0 Oct 07 19:13:35 crc kubenswrapper[4825]: I1007 19:13:35.627919 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" 
event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerDied","Data":"f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691"} Oct 07 19:13:35 crc kubenswrapper[4825]: I1007 19:13:35.708966 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:13:35 crc kubenswrapper[4825]: I1007 19:13:35.709347 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:13:36 crc kubenswrapper[4825]: I1007 19:13:36.668053 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fdc2"] Oct 07 19:13:36 crc kubenswrapper[4825]: I1007 19:13:36.668336 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8fdc2" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerName="registry-server" containerID="cri-o://0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b" gracePeriod=2 Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.274788 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.448458 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-catalog-content\") pod \"e43e163f-78f4-48a7-b53a-a762f06322c1\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.448595 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlqzx\" (UniqueName: \"kubernetes.io/projected/e43e163f-78f4-48a7-b53a-a762f06322c1-kube-api-access-xlqzx\") pod \"e43e163f-78f4-48a7-b53a-a762f06322c1\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.448646 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-utilities\") pod \"e43e163f-78f4-48a7-b53a-a762f06322c1\" (UID: \"e43e163f-78f4-48a7-b53a-a762f06322c1\") " Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.449768 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-utilities" (OuterVolumeSpecName: "utilities") pod "e43e163f-78f4-48a7-b53a-a762f06322c1" (UID: "e43e163f-78f4-48a7-b53a-a762f06322c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.455399 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43e163f-78f4-48a7-b53a-a762f06322c1-kube-api-access-xlqzx" (OuterVolumeSpecName: "kube-api-access-xlqzx") pod "e43e163f-78f4-48a7-b53a-a762f06322c1" (UID: "e43e163f-78f4-48a7-b53a-a762f06322c1"). InnerVolumeSpecName "kube-api-access-xlqzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.485116 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e43e163f-78f4-48a7-b53a-a762f06322c1" (UID: "e43e163f-78f4-48a7-b53a-a762f06322c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.549936 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlqzx\" (UniqueName: \"kubernetes.io/projected/e43e163f-78f4-48a7-b53a-a762f06322c1-kube-api-access-xlqzx\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.549975 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.549989 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e43e163f-78f4-48a7-b53a-a762f06322c1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.646444 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerStarted","Data":"2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38"} Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.648994 4825 generic.go:334] "Generic (PLEG): container finished" podID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerID="0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b" exitCode=0 Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.649025 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-8fdc2" event={"ID":"e43e163f-78f4-48a7-b53a-a762f06322c1","Type":"ContainerDied","Data":"0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b"} Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.649042 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fdc2" event={"ID":"e43e163f-78f4-48a7-b53a-a762f06322c1","Type":"ContainerDied","Data":"72215bd639a09c32dd48c0e686b997124945167ae08092affadb420f259c3bfe"} Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.649070 4825 scope.go:117] "RemoveContainer" containerID="0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.649161 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fdc2" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.678948 4825 scope.go:117] "RemoveContainer" containerID="b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.710930 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fdc2"] Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.715188 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fdc2"] Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.715412 4825 scope.go:117] "RemoveContainer" containerID="e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.745700 4825 scope.go:117] "RemoveContainer" containerID="0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b" Oct 07 19:13:38 crc kubenswrapper[4825]: E1007 19:13:38.746124 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b\": container with ID starting with 0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b not found: ID does not exist" containerID="0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.746156 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b"} err="failed to get container status \"0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b\": rpc error: code = NotFound desc = could not find container \"0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b\": container with ID starting with 0bd5ac408c7aa23df29ed8cf77d1ec0ea4ab35571e19ac0fae8c1e2b0b8aac3b not found: ID does not exist" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.746176 4825 scope.go:117] "RemoveContainer" containerID="b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87" Oct 07 19:13:38 crc kubenswrapper[4825]: E1007 19:13:38.746616 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87\": container with ID starting with b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87 not found: ID does not exist" containerID="b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.746662 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87"} err="failed to get container status \"b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87\": rpc error: code = NotFound desc = could not find container \"b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87\": container with ID 
starting with b1cd8e99d0ecfa510d1509681f62e0ac8b6652c01a48390cfa909aa9112faf87 not found: ID does not exist" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.746716 4825 scope.go:117] "RemoveContainer" containerID="e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276" Oct 07 19:13:38 crc kubenswrapper[4825]: E1007 19:13:38.747134 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276\": container with ID starting with e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276 not found: ID does not exist" containerID="e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276" Oct 07 19:13:38 crc kubenswrapper[4825]: I1007 19:13:38.747155 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276"} err="failed to get container status \"e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276\": rpc error: code = NotFound desc = could not find container \"e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276\": container with ID starting with e8579aced55968420b2dcbca98c2fab9d9d6323f2461435fa5174d3bfa25b276 not found: ID does not exist" Oct 07 19:13:39 crc kubenswrapper[4825]: I1007 19:13:39.659601 4825 generic.go:334] "Generic (PLEG): container finished" podID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerID="2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38" exitCode=0 Oct 07 19:13:39 crc kubenswrapper[4825]: I1007 19:13:39.659664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerDied","Data":"2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38"} Oct 07 19:13:39 crc kubenswrapper[4825]: I1007 19:13:39.821161 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" path="/var/lib/kubelet/pods/e43e163f-78f4-48a7-b53a-a762f06322c1/volumes" Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.267441 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xtjct"] Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.267725 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xtjct" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="registry-server" containerID="cri-o://772c7a9774cbcabc45f4003680615045c503d5590f14cfb940ff3d2c92a67264" gracePeriod=2 Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.667632 4825 generic.go:334] "Generic (PLEG): container finished" podID="989bf63e-9986-4d13-b374-a3188212cbcd" containerID="772c7a9774cbcabc45f4003680615045c503d5590f14cfb940ff3d2c92a67264" exitCode=0 Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.667720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtjct" event={"ID":"989bf63e-9986-4d13-b374-a3188212cbcd","Type":"ContainerDied","Data":"772c7a9774cbcabc45f4003680615045c503d5590f14cfb940ff3d2c92a67264"} Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.724559 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.879729 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cwz2\" (UniqueName: \"kubernetes.io/projected/989bf63e-9986-4d13-b374-a3188212cbcd-kube-api-access-4cwz2\") pod \"989bf63e-9986-4d13-b374-a3188212cbcd\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.879775 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-catalog-content\") pod \"989bf63e-9986-4d13-b374-a3188212cbcd\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.879874 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-utilities\") pod \"989bf63e-9986-4d13-b374-a3188212cbcd\" (UID: \"989bf63e-9986-4d13-b374-a3188212cbcd\") " Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.880965 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-utilities" (OuterVolumeSpecName: "utilities") pod "989bf63e-9986-4d13-b374-a3188212cbcd" (UID: "989bf63e-9986-4d13-b374-a3188212cbcd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.885353 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989bf63e-9986-4d13-b374-a3188212cbcd-kube-api-access-4cwz2" (OuterVolumeSpecName: "kube-api-access-4cwz2") pod "989bf63e-9986-4d13-b374-a3188212cbcd" (UID: "989bf63e-9986-4d13-b374-a3188212cbcd"). InnerVolumeSpecName "kube-api-access-4cwz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.935937 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "989bf63e-9986-4d13-b374-a3188212cbcd" (UID: "989bf63e-9986-4d13-b374-a3188212cbcd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.981580 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.981989 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cwz2\" (UniqueName: \"kubernetes.io/projected/989bf63e-9986-4d13-b374-a3188212cbcd-kube-api-access-4cwz2\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:40 crc kubenswrapper[4825]: I1007 19:13:40.982029 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989bf63e-9986-4d13-b374-a3188212cbcd-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.679145 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerStarted","Data":"f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5"} Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.682364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtjct" event={"ID":"989bf63e-9986-4d13-b374-a3188212cbcd","Type":"ContainerDied","Data":"0af64068b8bb731b38fb28a1ab3deed28763332a470663f7630641998e6aab8e"} Oct 07 19:13:41 crc kubenswrapper[4825]: 
I1007 19:13:41.682421 4825 scope.go:117] "RemoveContainer" containerID="772c7a9774cbcabc45f4003680615045c503d5590f14cfb940ff3d2c92a67264" Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.682428 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xtjct" Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.702573 4825 scope.go:117] "RemoveContainer" containerID="c86f5d2c81ee45c01243d0b15b00716c33ec37af1df501b6e2b2c3c656963b77" Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.702727 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2nn66" podStartSLOduration=3.867219826 podStartE2EDuration="8.70271538s" podCreationTimestamp="2025-10-07 19:13:33 +0000 UTC" firstStartedPulling="2025-10-07 19:13:35.629970591 +0000 UTC m=+804.452009228" lastFinishedPulling="2025-10-07 19:13:40.465466145 +0000 UTC m=+809.287504782" observedRunningTime="2025-10-07 19:13:41.700257702 +0000 UTC m=+810.522296369" watchObservedRunningTime="2025-10-07 19:13:41.70271538 +0000 UTC m=+810.524754027" Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.716375 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xtjct"] Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.725802 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xtjct"] Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.741307 4825 scope.go:117] "RemoveContainer" containerID="5c2e9311f81cceaa4dbadafe31c3b303f1f3036b619a432fc356bf5cd392bfac" Oct 07 19:13:41 crc kubenswrapper[4825]: I1007 19:13:41.806739 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" path="/var/lib/kubelet/pods/989bf63e-9986-4d13-b374-a3188212cbcd/volumes" Oct 07 19:13:44 crc kubenswrapper[4825]: I1007 19:13:44.127040 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7c9df698c8-5bgs4" Oct 07 19:13:44 crc kubenswrapper[4825]: I1007 19:13:44.205332 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:44 crc kubenswrapper[4825]: I1007 19:13:44.205388 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:44 crc kubenswrapper[4825]: I1007 19:13:44.243164 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:54 crc kubenswrapper[4825]: I1007 19:13:54.262856 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:56 crc kubenswrapper[4825]: I1007 19:13:56.670735 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nn66"] Oct 07 19:13:56 crc kubenswrapper[4825]: I1007 19:13:56.671577 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2nn66" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="registry-server" containerID="cri-o://f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5" gracePeriod=2 Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.148140 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.300991 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-utilities\") pod \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.301057 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djnft\" (UniqueName: \"kubernetes.io/projected/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-kube-api-access-djnft\") pod \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.301165 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-catalog-content\") pod \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\" (UID: \"b417dd7b-9eb2-41a9-a505-e4b49166d4d2\") " Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.302761 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-utilities" (OuterVolumeSpecName: "utilities") pod "b417dd7b-9eb2-41a9-a505-e4b49166d4d2" (UID: "b417dd7b-9eb2-41a9-a505-e4b49166d4d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.310906 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-kube-api-access-djnft" (OuterVolumeSpecName: "kube-api-access-djnft") pod "b417dd7b-9eb2-41a9-a505-e4b49166d4d2" (UID: "b417dd7b-9eb2-41a9-a505-e4b49166d4d2"). InnerVolumeSpecName "kube-api-access-djnft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.363744 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b417dd7b-9eb2-41a9-a505-e4b49166d4d2" (UID: "b417dd7b-9eb2-41a9-a505-e4b49166d4d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.403026 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.403101 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.403128 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djnft\" (UniqueName: \"kubernetes.io/projected/b417dd7b-9eb2-41a9-a505-e4b49166d4d2-kube-api-access-djnft\") on node \"crc\" DevicePath \"\"" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.793087 4825 generic.go:334] "Generic (PLEG): container finished" podID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerID="f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5" exitCode=0 Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.793159 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerDied","Data":"f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5"} Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.793183 4825 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nn66" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.793208 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nn66" event={"ID":"b417dd7b-9eb2-41a9-a505-e4b49166d4d2","Type":"ContainerDied","Data":"15b01a464d91acb0c41ceef0d4c54cb0b7bf95d87c2b9aefad0aa0cf27e4220d"} Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.793258 4825 scope.go:117] "RemoveContainer" containerID="f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.821382 4825 scope.go:117] "RemoveContainer" containerID="2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.849472 4825 scope.go:117] "RemoveContainer" containerID="f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.849575 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nn66"] Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.854692 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2nn66"] Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.887994 4825 scope.go:117] "RemoveContainer" containerID="f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5" Oct 07 19:13:57 crc kubenswrapper[4825]: E1007 19:13:57.889038 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5\": container with ID starting with f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5 not found: ID does not exist" containerID="f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.889098 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5"} err="failed to get container status \"f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5\": rpc error: code = NotFound desc = could not find container \"f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5\": container with ID starting with f553f86f6d0d18b06a151a478c28d03483f4e1ed9505848116c913bd337a31d5 not found: ID does not exist" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.889131 4825 scope.go:117] "RemoveContainer" containerID="2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38" Oct 07 19:13:57 crc kubenswrapper[4825]: E1007 19:13:57.889598 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38\": container with ID starting with 2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38 not found: ID does not exist" containerID="2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.889673 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38"} err="failed to get container status \"2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38\": rpc error: code = NotFound desc = could not find container \"2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38\": container with ID starting with 2a549d44c78c1eee3e37ee51d7488b69b9163d1e5ef65db579021ed58608af38 not found: ID does not exist" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.889722 4825 scope.go:117] "RemoveContainer" containerID="f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691" Oct 07 19:13:57 crc kubenswrapper[4825]: E1007 
19:13:57.890243 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691\": container with ID starting with f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691 not found: ID does not exist" containerID="f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691" Oct 07 19:13:57 crc kubenswrapper[4825]: I1007 19:13:57.890280 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691"} err="failed to get container status \"f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691\": rpc error: code = NotFound desc = could not find container \"f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691\": container with ID starting with f300d707033b9988b0e939e903ab55c59a74635326aa24fd2f45c7c3eafde691 not found: ID does not exist" Oct 07 19:13:59 crc kubenswrapper[4825]: I1007 19:13:59.814167 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" path="/var/lib/kubelet/pods/b417dd7b-9eb2-41a9-a505-e4b49166d4d2/volumes" Oct 07 19:14:03 crc kubenswrapper[4825]: I1007 19:14:03.863687 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-bb67dff7d-fcd7m" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.658821 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h9q88"] Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.659381 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerName="extract-utilities" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.659487 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" 
containerName="extract-utilities" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.659560 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="extract-utilities" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.659625 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="extract-utilities" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.659702 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="extract-content" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.659772 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="extract-content" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.659862 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.659972 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.660090 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.660172 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.660282 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerName="extract-content" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.660356 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" 
containerName="extract-content" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.660439 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="extract-utilities" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.660506 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="extract-utilities" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.660583 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="extract-content" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.660652 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="extract-content" Oct 07 19:14:04 crc kubenswrapper[4825]: E1007 19:14:04.660722 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.660793 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.661011 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b417dd7b-9eb2-41a9-a505-e4b49166d4d2" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.661097 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="989bf63e-9986-4d13-b374-a3188212cbcd" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.661299 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43e163f-78f4-48a7-b53a-a762f06322c1" containerName="registry-server" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.663787 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.667389 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.667731 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-smfcb" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.668790 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.679897 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7"] Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.680969 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.682811 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.695167 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7"] Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.759049 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-x5hwq"] Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.760177 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-x5hwq" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.762864 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.762874 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.763112 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.768461 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nknr8" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.806456 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-6vp5x"] Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.807795 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.812876 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.824290 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-6vp5x"] Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.830598 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15d8205a-b357-40f1-813d-e42c9d6ac2f0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sd7s7\" (UID: \"15d8205a-b357-40f1-813d-e42c9d6ac2f0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.830674 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-metrics\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.830715 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-sockets\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.830759 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6whw\" (UniqueName: \"kubernetes.io/projected/15d8205a-b357-40f1-813d-e42c9d6ac2f0-kube-api-access-c6whw\") pod \"frr-k8s-webhook-server-64bf5d555-sd7s7\" (UID: \"15d8205a-b357-40f1-813d-e42c9d6ac2f0\") " 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.830797 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-conf\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.830818 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-reloader\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.831068 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-metrics-certs\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.831117 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-startup\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.831179 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2ppv\" (UniqueName: \"kubernetes.io/projected/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-kube-api-access-h2ppv\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: 
I1007 19:14:04.936355 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6whw\" (UniqueName: \"kubernetes.io/projected/15d8205a-b357-40f1-813d-e42c9d6ac2f0-kube-api-access-c6whw\") pod \"frr-k8s-webhook-server-64bf5d555-sd7s7\" (UID: \"15d8205a-b357-40f1-813d-e42c9d6ac2f0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937362 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-conf\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937486 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e705456a-fdcd-4d7e-b3e9-0146cf587db8-metallb-excludel2\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937577 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-reloader\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937676 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nx2p\" (UniqueName: \"kubernetes.io/projected/5c658b5b-d9ac-4877-894a-770c7fefcf5e-kube-api-access-4nx2p\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937751 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-metrics-certs\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937832 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c658b5b-d9ac-4877-894a-770c7fefcf5e-cert\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937890 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-reloader\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937867 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-conf\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.937978 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-startup\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938042 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-metrics-certs\") pod \"frr-k8s-h9q88\" (UID: 
\"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938084 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938121 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2ppv\" (UniqueName: \"kubernetes.io/projected/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-kube-api-access-h2ppv\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938152 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbl2\" (UniqueName: \"kubernetes.io/projected/e705456a-fdcd-4d7e-b3e9-0146cf587db8-kube-api-access-5qbl2\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938333 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15d8205a-b357-40f1-813d-e42c9d6ac2f0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sd7s7\" (UID: \"15d8205a-b357-40f1-813d-e42c9d6ac2f0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938386 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c658b5b-d9ac-4877-894a-770c7fefcf5e-metrics-certs\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " 
pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938418 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-metrics\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-sockets\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938857 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-sockets\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938861 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-metrics\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.938947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-frr-startup\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.946869 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/15d8205a-b357-40f1-813d-e42c9d6ac2f0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sd7s7\" (UID: \"15d8205a-b357-40f1-813d-e42c9d6ac2f0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.949393 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-metrics-certs\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.976325 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2ppv\" (UniqueName: \"kubernetes.io/projected/d7287c8b-6db9-4ec7-b7a0-52fd36aec363-kube-api-access-h2ppv\") pod \"frr-k8s-h9q88\" (UID: \"d7287c8b-6db9-4ec7-b7a0-52fd36aec363\") " pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.982440 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6whw\" (UniqueName: \"kubernetes.io/projected/15d8205a-b357-40f1-813d-e42c9d6ac2f0-kube-api-access-c6whw\") pod \"frr-k8s-webhook-server-64bf5d555-sd7s7\" (UID: \"15d8205a-b357-40f1-813d-e42c9d6ac2f0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.984464 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h9q88" Oct 07 19:14:04 crc kubenswrapper[4825]: I1007 19:14:04.996824 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.039030 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c658b5b-d9ac-4877-894a-770c7fefcf5e-metrics-certs\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.039120 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e705456a-fdcd-4d7e-b3e9-0146cf587db8-metallb-excludel2\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.039174 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nx2p\" (UniqueName: \"kubernetes.io/projected/5c658b5b-d9ac-4877-894a-770c7fefcf5e-kube-api-access-4nx2p\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.039199 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-metrics-certs\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.039220 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c658b5b-d9ac-4877-894a-770c7fefcf5e-cert\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc 
kubenswrapper[4825]: I1007 19:14:05.039277 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.039353 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbl2\" (UniqueName: \"kubernetes.io/projected/e705456a-fdcd-4d7e-b3e9-0146cf587db8-kube-api-access-5qbl2\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: E1007 19:14:05.040563 4825 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 19:14:05 crc kubenswrapper[4825]: E1007 19:14:05.040704 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist podName:e705456a-fdcd-4d7e-b3e9-0146cf587db8 nodeName:}" failed. No retries permitted until 2025-10-07 19:14:05.540650506 +0000 UTC m=+834.362689143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist") pod "speaker-x5hwq" (UID: "e705456a-fdcd-4d7e-b3e9-0146cf587db8") : secret "metallb-memberlist" not found Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.040946 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e705456a-fdcd-4d7e-b3e9-0146cf587db8-metallb-excludel2\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.044077 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c658b5b-d9ac-4877-894a-770c7fefcf5e-cert\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.044608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-metrics-certs\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.047664 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c658b5b-d9ac-4877-894a-770c7fefcf5e-metrics-certs\") pod \"controller-68d546b9d8-6vp5x\" (UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.069841 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nx2p\" (UniqueName: \"kubernetes.io/projected/5c658b5b-d9ac-4877-894a-770c7fefcf5e-kube-api-access-4nx2p\") pod \"controller-68d546b9d8-6vp5x\" 
(UID: \"5c658b5b-d9ac-4877-894a-770c7fefcf5e\") " pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.070812 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbl2\" (UniqueName: \"kubernetes.io/projected/e705456a-fdcd-4d7e-b3e9-0146cf587db8-kube-api-access-5qbl2\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.123668 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.250688 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7"] Oct 07 19:14:05 crc kubenswrapper[4825]: W1007 19:14:05.258950 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d8205a_b357_40f1_813d_e42c9d6ac2f0.slice/crio-3f65611d4bfe492c465b68f551349c156f6204bef96d49ee0295d3275b36cdd2 WatchSource:0}: Error finding container 3f65611d4bfe492c465b68f551349c156f6204bef96d49ee0295d3275b36cdd2: Status 404 returned error can't find the container with id 3f65611d4bfe492c465b68f551349c156f6204bef96d49ee0295d3275b36cdd2 Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.546032 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:05 crc kubenswrapper[4825]: E1007 19:14:05.546495 4825 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 19:14:05 crc kubenswrapper[4825]: E1007 19:14:05.546699 4825 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist podName:e705456a-fdcd-4d7e-b3e9-0146cf587db8 nodeName:}" failed. No retries permitted until 2025-10-07 19:14:06.546669769 +0000 UTC m=+835.368708406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist") pod "speaker-x5hwq" (UID: "e705456a-fdcd-4d7e-b3e9-0146cf587db8") : secret "metallb-memberlist" not found Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.548835 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-6vp5x"] Oct 07 19:14:05 crc kubenswrapper[4825]: W1007 19:14:05.555621 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c658b5b_d9ac_4877_894a_770c7fefcf5e.slice/crio-2bb765ed239a2c3fa70688ce8958cfed33b0b5bcb107e183791bfe22d7806e75 WatchSource:0}: Error finding container 2bb765ed239a2c3fa70688ce8958cfed33b0b5bcb107e183791bfe22d7806e75: Status 404 returned error can't find the container with id 2bb765ed239a2c3fa70688ce8958cfed33b0b5bcb107e183791bfe22d7806e75 Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.709141 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.709313 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.709404 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.710577 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1aff985c4d465af81432b2c0fd1da9cb01f3378b2087e04530a854de44547a92"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.710681 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://1aff985c4d465af81432b2c0fd1da9cb01f3378b2087e04530a854de44547a92" gracePeriod=600 Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.861998 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6vp5x" event={"ID":"5c658b5b-d9ac-4877-894a-770c7fefcf5e","Type":"ContainerStarted","Data":"2a2cd4165639a7240aee0368e4c6abe208746fa35d62b569fc92c38a116afe4d"} Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.862079 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6vp5x" event={"ID":"5c658b5b-d9ac-4877-894a-770c7fefcf5e","Type":"ContainerStarted","Data":"c28b92721c4f5e30c1e6dc6ca5315fe8c56e4833f40a1a6dcf1e83ffceaf9090"} Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.862093 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6vp5x" event={"ID":"5c658b5b-d9ac-4877-894a-770c7fefcf5e","Type":"ContainerStarted","Data":"2bb765ed239a2c3fa70688ce8958cfed33b0b5bcb107e183791bfe22d7806e75"} 
Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.862135 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.864847 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="1aff985c4d465af81432b2c0fd1da9cb01f3378b2087e04530a854de44547a92" exitCode=0 Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.864936 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"1aff985c4d465af81432b2c0fd1da9cb01f3378b2087e04530a854de44547a92"} Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.865008 4825 scope.go:117] "RemoveContainer" containerID="363a8a3b4b4e09ebede35ca07198927c54a01eb1008f7fc708faf1a573e0f6cc" Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.866045 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerStarted","Data":"28ac89eea36bc0d3fd445cad5484c80f13b46eb919b77ae9b0f8a1606a5cc01c"} Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.867193 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" event={"ID":"15d8205a-b357-40f1-813d-e42c9d6ac2f0","Type":"ContainerStarted","Data":"3f65611d4bfe492c465b68f551349c156f6204bef96d49ee0295d3275b36cdd2"} Oct 07 19:14:05 crc kubenswrapper[4825]: I1007 19:14:05.886400 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-6vp5x" podStartSLOduration=1.8863767 podStartE2EDuration="1.8863767s" podCreationTimestamp="2025-10-07 19:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 19:14:05.882271709 +0000 UTC m=+834.704310366" watchObservedRunningTime="2025-10-07 19:14:05.8863767 +0000 UTC m=+834.708415347" Oct 07 19:14:06 crc kubenswrapper[4825]: I1007 19:14:06.564979 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:06 crc kubenswrapper[4825]: I1007 19:14:06.572546 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e705456a-fdcd-4d7e-b3e9-0146cf587db8-memberlist\") pod \"speaker-x5hwq\" (UID: \"e705456a-fdcd-4d7e-b3e9-0146cf587db8\") " pod="metallb-system/speaker-x5hwq" Oct 07 19:14:06 crc kubenswrapper[4825]: I1007 19:14:06.574117 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-x5hwq" Oct 07 19:14:06 crc kubenswrapper[4825]: I1007 19:14:06.879364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"906a228c17f7770f9388dbe04c2d4927f00d2c55a2ece21a3cd466abc03da78e"} Oct 07 19:14:06 crc kubenswrapper[4825]: I1007 19:14:06.885449 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x5hwq" event={"ID":"e705456a-fdcd-4d7e-b3e9-0146cf587db8","Type":"ContainerStarted","Data":"86708c717339be4a4b3a4e6686e5c5003f92e12189d85aa2240454af5c284d99"} Oct 07 19:14:07 crc kubenswrapper[4825]: I1007 19:14:07.903993 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x5hwq" event={"ID":"e705456a-fdcd-4d7e-b3e9-0146cf587db8","Type":"ContainerStarted","Data":"19d85e11006619862017e07556b38936c77aa5693021d8248151d98dd5dac7a7"} Oct 07 19:14:07 
crc kubenswrapper[4825]: I1007 19:14:07.904058 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x5hwq" event={"ID":"e705456a-fdcd-4d7e-b3e9-0146cf587db8","Type":"ContainerStarted","Data":"4c3e24a7be7a08e85a6f8b0b0d6122d09608ceda31601e7d6703b7c972f89e52"} Oct 07 19:14:07 crc kubenswrapper[4825]: I1007 19:14:07.904104 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-x5hwq" Oct 07 19:14:07 crc kubenswrapper[4825]: I1007 19:14:07.931752 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-x5hwq" podStartSLOduration=3.931720774 podStartE2EDuration="3.931720774s" podCreationTimestamp="2025-10-07 19:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:14:07.928623654 +0000 UTC m=+836.750662291" watchObservedRunningTime="2025-10-07 19:14:07.931720774 +0000 UTC m=+836.753759411" Oct 07 19:14:15 crc kubenswrapper[4825]: I1007 19:14:15.130454 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-6vp5x" Oct 07 19:14:16 crc kubenswrapper[4825]: I1007 19:14:16.579824 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-x5hwq" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.376298 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ghkz5"] Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.377784 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ghkz5" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.378568 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmhk\" (UniqueName: \"kubernetes.io/projected/00896ba4-a0f5-43ca-88a1-d684239d2558-kube-api-access-ggmhk\") pod \"openstack-operator-index-ghkz5\" (UID: \"00896ba4-a0f5-43ca-88a1-d684239d2558\") " pod="openstack-operators/openstack-operator-index-ghkz5" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.379659 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-t7q29" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.379674 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.380568 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.399368 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ghkz5"] Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.479946 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmhk\" (UniqueName: \"kubernetes.io/projected/00896ba4-a0f5-43ca-88a1-d684239d2558-kube-api-access-ggmhk\") pod \"openstack-operator-index-ghkz5\" (UID: \"00896ba4-a0f5-43ca-88a1-d684239d2558\") " pod="openstack-operators/openstack-operator-index-ghkz5" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.496780 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmhk\" (UniqueName: \"kubernetes.io/projected/00896ba4-a0f5-43ca-88a1-d684239d2558-kube-api-access-ggmhk\") pod \"openstack-operator-index-ghkz5\" (UID: 
\"00896ba4-a0f5-43ca-88a1-d684239d2558\") " pod="openstack-operators/openstack-operator-index-ghkz5" Oct 07 19:14:19 crc kubenswrapper[4825]: I1007 19:14:19.693313 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ghkz5" Oct 07 19:14:21 crc kubenswrapper[4825]: I1007 19:14:21.496719 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ghkz5"] Oct 07 19:14:21 crc kubenswrapper[4825]: W1007 19:14:21.507040 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00896ba4_a0f5_43ca_88a1_d684239d2558.slice/crio-b0d25fa11fd6acbebe00dae28b0bec224da589d62c9e7b41aa2f23e239163486 WatchSource:0}: Error finding container b0d25fa11fd6acbebe00dae28b0bec224da589d62c9e7b41aa2f23e239163486: Status 404 returned error can't find the container with id b0d25fa11fd6acbebe00dae28b0bec224da589d62c9e7b41aa2f23e239163486 Oct 07 19:14:22 crc kubenswrapper[4825]: I1007 19:14:22.023055 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" event={"ID":"15d8205a-b357-40f1-813d-e42c9d6ac2f0","Type":"ContainerStarted","Data":"01a7bbe4175bc125e2bd3e57271fd455ad79202bca11ec0ba102e80ef6ac0d51"} Oct 07 19:14:22 crc kubenswrapper[4825]: I1007 19:14:22.023318 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" Oct 07 19:14:22 crc kubenswrapper[4825]: I1007 19:14:22.025633 4825 generic.go:334] "Generic (PLEG): container finished" podID="d7287c8b-6db9-4ec7-b7a0-52fd36aec363" containerID="4a4ed327a70ea3fc269afdd54ac273ebaf729be8f1e83a5560b26ffe212a26af" exitCode=0 Oct 07 19:14:22 crc kubenswrapper[4825]: I1007 19:14:22.025730 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" 
event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerDied","Data":"4a4ed327a70ea3fc269afdd54ac273ebaf729be8f1e83a5560b26ffe212a26af"} Oct 07 19:14:22 crc kubenswrapper[4825]: I1007 19:14:22.027134 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ghkz5" event={"ID":"00896ba4-a0f5-43ca-88a1-d684239d2558","Type":"ContainerStarted","Data":"b0d25fa11fd6acbebe00dae28b0bec224da589d62c9e7b41aa2f23e239163486"} Oct 07 19:14:22 crc kubenswrapper[4825]: I1007 19:14:22.044311 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7" podStartSLOduration=2.123144161 podStartE2EDuration="18.044282947s" podCreationTimestamp="2025-10-07 19:14:04 +0000 UTC" firstStartedPulling="2025-10-07 19:14:05.266829237 +0000 UTC m=+834.088867864" lastFinishedPulling="2025-10-07 19:14:21.187968013 +0000 UTC m=+850.010006650" observedRunningTime="2025-10-07 19:14:22.042134208 +0000 UTC m=+850.864172905" watchObservedRunningTime="2025-10-07 19:14:22.044282947 +0000 UTC m=+850.866321584" Oct 07 19:14:22 crc kubenswrapper[4825]: I1007 19:14:22.749517 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ghkz5"] Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.036776 4825 generic.go:334] "Generic (PLEG): container finished" podID="d7287c8b-6db9-4ec7-b7a0-52fd36aec363" containerID="dbc01bd7634af85d01dcae331357de8a67dcc2c75e3a1d16507531593383f433" exitCode=0 Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.036854 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerDied","Data":"dbc01bd7634af85d01dcae331357de8a67dcc2c75e3a1d16507531593383f433"} Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.355491 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-l89d7"] Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.356328 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-l89d7" Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.366678 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l89d7"] Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.483090 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8hx8\" (UniqueName: \"kubernetes.io/projected/fd7a1b83-b50f-41c1-8092-ce7135ffe155-kube-api-access-j8hx8\") pod \"openstack-operator-index-l89d7\" (UID: \"fd7a1b83-b50f-41c1-8092-ce7135ffe155\") " pod="openstack-operators/openstack-operator-index-l89d7" Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.584693 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8hx8\" (UniqueName: \"kubernetes.io/projected/fd7a1b83-b50f-41c1-8092-ce7135ffe155-kube-api-access-j8hx8\") pod \"openstack-operator-index-l89d7\" (UID: \"fd7a1b83-b50f-41c1-8092-ce7135ffe155\") " pod="openstack-operators/openstack-operator-index-l89d7" Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.618799 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8hx8\" (UniqueName: \"kubernetes.io/projected/fd7a1b83-b50f-41c1-8092-ce7135ffe155-kube-api-access-j8hx8\") pod \"openstack-operator-index-l89d7\" (UID: \"fd7a1b83-b50f-41c1-8092-ce7135ffe155\") " pod="openstack-operators/openstack-operator-index-l89d7" Oct 07 19:14:23 crc kubenswrapper[4825]: I1007 19:14:23.703868 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-l89d7" Oct 07 19:14:24 crc kubenswrapper[4825]: I1007 19:14:24.422959 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l89d7"] Oct 07 19:14:24 crc kubenswrapper[4825]: W1007 19:14:24.434180 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd7a1b83_b50f_41c1_8092_ce7135ffe155.slice/crio-a0baee5a003e1f7b4e02399ac272d953059b30deb4c50d557e7bcded6014b3e0 WatchSource:0}: Error finding container a0baee5a003e1f7b4e02399ac272d953059b30deb4c50d557e7bcded6014b3e0: Status 404 returned error can't find the container with id a0baee5a003e1f7b4e02399ac272d953059b30deb4c50d557e7bcded6014b3e0 Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.050401 4825 generic.go:334] "Generic (PLEG): container finished" podID="d7287c8b-6db9-4ec7-b7a0-52fd36aec363" containerID="6a04bfe1cdde405bea063b803e10f9331dda2ee91b828a1151379293fbbe4d42" exitCode=0 Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.050481 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerDied","Data":"6a04bfe1cdde405bea063b803e10f9331dda2ee91b828a1151379293fbbe4d42"} Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.052215 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ghkz5" event={"ID":"00896ba4-a0f5-43ca-88a1-d684239d2558","Type":"ContainerStarted","Data":"29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626"} Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.052270 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ghkz5" podUID="00896ba4-a0f5-43ca-88a1-d684239d2558" containerName="registry-server" 
containerID="cri-o://29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626" gracePeriod=2 Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.053645 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l89d7" event={"ID":"fd7a1b83-b50f-41c1-8092-ce7135ffe155","Type":"ContainerStarted","Data":"52f213666b440b878bf6174ed517d1750ec0199258da6b79971400acbc866dfc"} Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.053692 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l89d7" event={"ID":"fd7a1b83-b50f-41c1-8092-ce7135ffe155","Type":"ContainerStarted","Data":"a0baee5a003e1f7b4e02399ac272d953059b30deb4c50d557e7bcded6014b3e0"} Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.099589 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ghkz5" podStartSLOduration=3.3868790300000002 podStartE2EDuration="6.099567638s" podCreationTimestamp="2025-10-07 19:14:19 +0000 UTC" firstStartedPulling="2025-10-07 19:14:21.524959167 +0000 UTC m=+850.346997804" lastFinishedPulling="2025-10-07 19:14:24.237647775 +0000 UTC m=+853.059686412" observedRunningTime="2025-10-07 19:14:25.095652913 +0000 UTC m=+853.917691550" watchObservedRunningTime="2025-10-07 19:14:25.099567638 +0000 UTC m=+853.921606285" Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.114737 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-l89d7" podStartSLOduration=2.055077742 podStartE2EDuration="2.114718834s" podCreationTimestamp="2025-10-07 19:14:23 +0000 UTC" firstStartedPulling="2025-10-07 19:14:24.438593197 +0000 UTC m=+853.260631834" lastFinishedPulling="2025-10-07 19:14:24.498234289 +0000 UTC m=+853.320272926" observedRunningTime="2025-10-07 19:14:25.108832005 +0000 UTC m=+853.930870652" watchObservedRunningTime="2025-10-07 
19:14:25.114718834 +0000 UTC m=+853.936757471"
Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.425011 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ghkz5"
Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.619897 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggmhk\" (UniqueName: \"kubernetes.io/projected/00896ba4-a0f5-43ca-88a1-d684239d2558-kube-api-access-ggmhk\") pod \"00896ba4-a0f5-43ca-88a1-d684239d2558\" (UID: \"00896ba4-a0f5-43ca-88a1-d684239d2558\") "
Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.624884 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00896ba4-a0f5-43ca-88a1-d684239d2558-kube-api-access-ggmhk" (OuterVolumeSpecName: "kube-api-access-ggmhk") pod "00896ba4-a0f5-43ca-88a1-d684239d2558" (UID: "00896ba4-a0f5-43ca-88a1-d684239d2558"). InnerVolumeSpecName "kube-api-access-ggmhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 19:14:25 crc kubenswrapper[4825]: I1007 19:14:25.724401 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggmhk\" (UniqueName: \"kubernetes.io/projected/00896ba4-a0f5-43ca-88a1-d684239d2558-kube-api-access-ggmhk\") on node \"crc\" DevicePath \"\""
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.065009 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerStarted","Data":"b6889c01e13cb5cdad08d8e62bdc19ac362212358cb186999eb2ca4eab6d17c2"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.065058 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerStarted","Data":"ad499e1474f1da08c0b840f31d51cc9eec52b3f2ef4b84ac685e92c004d41136"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.065072 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerStarted","Data":"d4511634ce713e409bd0feaa3805ec5f5ffc2a2eca16e87a8b7643715b6aa778"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.065084 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerStarted","Data":"33773b4e40c270c43a957da1d9308fcf04f1fea9bd06ed884b67820c08c6abc1"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.065095 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerStarted","Data":"849e6dbfead051eca8ce4283de70ccab82f6fb691c17a58d0a9392e71bbc9677"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.065107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9q88" event={"ID":"d7287c8b-6db9-4ec7-b7a0-52fd36aec363","Type":"ContainerStarted","Data":"55301374b20da7abd86d755b39d4dbf3a74738c39ec40b2efb586ba53502c7b6"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.065194 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h9q88"
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.066902 4825 generic.go:334] "Generic (PLEG): container finished" podID="00896ba4-a0f5-43ca-88a1-d684239d2558" containerID="29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626" exitCode=0
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.066999 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ghkz5"
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.067010 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ghkz5" event={"ID":"00896ba4-a0f5-43ca-88a1-d684239d2558","Type":"ContainerDied","Data":"29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.067056 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ghkz5" event={"ID":"00896ba4-a0f5-43ca-88a1-d684239d2558","Type":"ContainerDied","Data":"b0d25fa11fd6acbebe00dae28b0bec224da589d62c9e7b41aa2f23e239163486"}
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.067080 4825 scope.go:117] "RemoveContainer" containerID="29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626"
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.102453 4825 scope.go:117] "RemoveContainer" containerID="29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626"
Oct 07 19:14:26 crc kubenswrapper[4825]: E1007 19:14:26.105446 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626\": container with ID starting with 29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626 not found: ID does not exist" containerID="29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626"
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.105492 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626"} err="failed to get container status \"29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626\": rpc error: code = NotFound desc = could not find container \"29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626\": container with ID starting with 29284f49e47267c229221ee727d0b735a0e2ee5b3b54f66e7242510db204b626 not found: ID does not exist"
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.127706 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-h9q88" podStartSLOduration=6.086894857 podStartE2EDuration="22.127684029s" podCreationTimestamp="2025-10-07 19:14:04 +0000 UTC" firstStartedPulling="2025-10-07 19:14:05.169648982 +0000 UTC m=+833.991687619" lastFinishedPulling="2025-10-07 19:14:21.210438114 +0000 UTC m=+850.032476791" observedRunningTime="2025-10-07 19:14:26.122761761 +0000 UTC m=+854.944800398" watchObservedRunningTime="2025-10-07 19:14:26.127684029 +0000 UTC m=+854.949722666"
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.143485 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ghkz5"]
Oct 07 19:14:26 crc kubenswrapper[4825]: I1007 19:14:26.149422 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ghkz5"]
Oct 07 19:14:27 crc kubenswrapper[4825]: I1007 19:14:27.802359 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00896ba4-a0f5-43ca-88a1-d684239d2558" path="/var/lib/kubelet/pods/00896ba4-a0f5-43ca-88a1-d684239d2558/volumes"
Oct 07 19:14:29 crc kubenswrapper[4825]: I1007 19:14:29.984922 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h9q88"
Oct 07 19:14:30 crc kubenswrapper[4825]: I1007 19:14:30.038711 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h9q88"
Oct 07 19:14:33 crc kubenswrapper[4825]: I1007 19:14:33.704413 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-l89d7"
Oct 07 19:14:33 crc kubenswrapper[4825]: I1007 19:14:33.705684 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-l89d7"
Oct 07 19:14:33 crc kubenswrapper[4825]: I1007 19:14:33.750857 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-l89d7"
Oct 07 19:14:34 crc kubenswrapper[4825]: I1007 19:14:34.164282 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-l89d7"
Oct 07 19:14:35 crc kubenswrapper[4825]: I1007 19:14:35.003913 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sd7s7"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.081376 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"]
Oct 07 19:14:41 crc kubenswrapper[4825]: E1007 19:14:41.083467 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00896ba4-a0f5-43ca-88a1-d684239d2558" containerName="registry-server"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.083504 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="00896ba4-a0f5-43ca-88a1-d684239d2558" containerName="registry-server"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.084166 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="00896ba4-a0f5-43ca-88a1-d684239d2558" containerName="registry-server"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.087938 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.097438 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2n6fn"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.104458 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"]
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.156472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-util\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.156700 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd8mz\" (UniqueName: \"kubernetes.io/projected/513bdedd-0708-4b31-afc3-93beee6324dd-kube-api-access-pd8mz\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.156923 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-bundle\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.257819 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd8mz\" (UniqueName: \"kubernetes.io/projected/513bdedd-0708-4b31-afc3-93beee6324dd-kube-api-access-pd8mz\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.257918 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-bundle\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.257949 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-util\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.258482 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-util\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.258697 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-bundle\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.278909 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd8mz\" (UniqueName: \"kubernetes.io/projected/513bdedd-0708-4b31-afc3-93beee6324dd-kube-api-access-pd8mz\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.430073 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:41 crc kubenswrapper[4825]: I1007 19:14:41.913471 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"]
Oct 07 19:14:41 crc kubenswrapper[4825]: W1007 19:14:41.921408 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513bdedd_0708_4b31_afc3_93beee6324dd.slice/crio-446ad4aa3a7f7f4b8fbd72e528da9752e01ebdba174e974ef9a488745160566c WatchSource:0}: Error finding container 446ad4aa3a7f7f4b8fbd72e528da9752e01ebdba174e974ef9a488745160566c: Status 404 returned error can't find the container with id 446ad4aa3a7f7f4b8fbd72e528da9752e01ebdba174e974ef9a488745160566c
Oct 07 19:14:42 crc kubenswrapper[4825]: I1007 19:14:42.197341 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2" event={"ID":"513bdedd-0708-4b31-afc3-93beee6324dd","Type":"ContainerStarted","Data":"dd608a30b57b4358f1099de457141afe151236ae35cb8dade22e0b8d4bc75dce"}
Oct 07 19:14:42 crc kubenswrapper[4825]: I1007 19:14:42.198311 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2" event={"ID":"513bdedd-0708-4b31-afc3-93beee6324dd","Type":"ContainerStarted","Data":"446ad4aa3a7f7f4b8fbd72e528da9752e01ebdba174e974ef9a488745160566c"}
Oct 07 19:14:43 crc kubenswrapper[4825]: I1007 19:14:43.206196 4825 generic.go:334] "Generic (PLEG): container finished" podID="513bdedd-0708-4b31-afc3-93beee6324dd" containerID="dd608a30b57b4358f1099de457141afe151236ae35cb8dade22e0b8d4bc75dce" exitCode=0
Oct 07 19:14:43 crc kubenswrapper[4825]: I1007 19:14:43.206281 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2" event={"ID":"513bdedd-0708-4b31-afc3-93beee6324dd","Type":"ContainerDied","Data":"dd608a30b57b4358f1099de457141afe151236ae35cb8dade22e0b8d4bc75dce"}
Oct 07 19:14:44 crc kubenswrapper[4825]: I1007 19:14:44.217440 4825 generic.go:334] "Generic (PLEG): container finished" podID="513bdedd-0708-4b31-afc3-93beee6324dd" containerID="803b242477f3ec8b2e08d1e648fc3fa9fe05eb3abcad4037589937d1e2f488ea" exitCode=0
Oct 07 19:14:44 crc kubenswrapper[4825]: I1007 19:14:44.217532 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2" event={"ID":"513bdedd-0708-4b31-afc3-93beee6324dd","Type":"ContainerDied","Data":"803b242477f3ec8b2e08d1e648fc3fa9fe05eb3abcad4037589937d1e2f488ea"}
Oct 07 19:14:44 crc kubenswrapper[4825]: I1007 19:14:44.988826 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h9q88"
Oct 07 19:14:45 crc kubenswrapper[4825]: I1007 19:14:45.229420 4825 generic.go:334] "Generic (PLEG): container finished" podID="513bdedd-0708-4b31-afc3-93beee6324dd" containerID="fde5784c8a4decf879f3de71cd5d83e72372a73f40932d6c30ba6ddc06a4f0ee" exitCode=0
Oct 07 19:14:45 crc kubenswrapper[4825]: I1007 19:14:45.229486 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2" event={"ID":"513bdedd-0708-4b31-afc3-93beee6324dd","Type":"ContainerDied","Data":"fde5784c8a4decf879f3de71cd5d83e72372a73f40932d6c30ba6ddc06a4f0ee"}
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.504896 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.540684 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-bundle\") pod \"513bdedd-0708-4b31-afc3-93beee6324dd\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") "
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.540757 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd8mz\" (UniqueName: \"kubernetes.io/projected/513bdedd-0708-4b31-afc3-93beee6324dd-kube-api-access-pd8mz\") pod \"513bdedd-0708-4b31-afc3-93beee6324dd\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") "
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.540805 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-util\") pod \"513bdedd-0708-4b31-afc3-93beee6324dd\" (UID: \"513bdedd-0708-4b31-afc3-93beee6324dd\") "
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.542017 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-bundle" (OuterVolumeSpecName: "bundle") pod "513bdedd-0708-4b31-afc3-93beee6324dd" (UID: "513bdedd-0708-4b31-afc3-93beee6324dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.547544 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513bdedd-0708-4b31-afc3-93beee6324dd-kube-api-access-pd8mz" (OuterVolumeSpecName: "kube-api-access-pd8mz") pod "513bdedd-0708-4b31-afc3-93beee6324dd" (UID: "513bdedd-0708-4b31-afc3-93beee6324dd"). InnerVolumeSpecName "kube-api-access-pd8mz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.554900 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-util" (OuterVolumeSpecName: "util") pod "513bdedd-0708-4b31-afc3-93beee6324dd" (UID: "513bdedd-0708-4b31-afc3-93beee6324dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.641877 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.641907 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd8mz\" (UniqueName: \"kubernetes.io/projected/513bdedd-0708-4b31-afc3-93beee6324dd-kube-api-access-pd8mz\") on node \"crc\" DevicePath \"\""
Oct 07 19:14:46 crc kubenswrapper[4825]: I1007 19:14:46.641920 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/513bdedd-0708-4b31-afc3-93beee6324dd-util\") on node \"crc\" DevicePath \"\""
Oct 07 19:14:47 crc kubenswrapper[4825]: I1007 19:14:47.247613 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2" event={"ID":"513bdedd-0708-4b31-afc3-93beee6324dd","Type":"ContainerDied","Data":"446ad4aa3a7f7f4b8fbd72e528da9752e01ebdba174e974ef9a488745160566c"}
Oct 07 19:14:47 crc kubenswrapper[4825]: I1007 19:14:47.247677 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="446ad4aa3a7f7f4b8fbd72e528da9752e01ebdba174e974ef9a488745160566c"
Oct 07 19:14:47 crc kubenswrapper[4825]: I1007 19:14:47.247679 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.671096 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"]
Oct 07 19:14:53 crc kubenswrapper[4825]: E1007 19:14:53.671968 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513bdedd-0708-4b31-afc3-93beee6324dd" containerName="util"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.671991 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="513bdedd-0708-4b31-afc3-93beee6324dd" containerName="util"
Oct 07 19:14:53 crc kubenswrapper[4825]: E1007 19:14:53.672020 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513bdedd-0708-4b31-afc3-93beee6324dd" containerName="extract"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.672033 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="513bdedd-0708-4b31-afc3-93beee6324dd" containerName="extract"
Oct 07 19:14:53 crc kubenswrapper[4825]: E1007 19:14:53.672054 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513bdedd-0708-4b31-afc3-93beee6324dd" containerName="pull"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.672067 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="513bdedd-0708-4b31-afc3-93beee6324dd" containerName="pull"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.672301 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="513bdedd-0708-4b31-afc3-93beee6324dd" containerName="extract"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.673346 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.675446 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-jwpf8"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.696283 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"]
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.773475 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzc9t\" (UniqueName: \"kubernetes.io/projected/0b0eb630-7794-4425-9ada-29b15acb6bdb-kube-api-access-fzc9t\") pod \"openstack-operator-controller-operator-6687d89476-w4tpc\" (UID: \"0b0eb630-7794-4425-9ada-29b15acb6bdb\") " pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.875414 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzc9t\" (UniqueName: \"kubernetes.io/projected/0b0eb630-7794-4425-9ada-29b15acb6bdb-kube-api-access-fzc9t\") pod \"openstack-operator-controller-operator-6687d89476-w4tpc\" (UID: \"0b0eb630-7794-4425-9ada-29b15acb6bdb\") " pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.898799 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzc9t\" (UniqueName: \"kubernetes.io/projected/0b0eb630-7794-4425-9ada-29b15acb6bdb-kube-api-access-fzc9t\") pod \"openstack-operator-controller-operator-6687d89476-w4tpc\" (UID: \"0b0eb630-7794-4425-9ada-29b15acb6bdb\") " pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"
Oct 07 19:14:53 crc kubenswrapper[4825]: I1007 19:14:53.992487 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"
Oct 07 19:14:54 crc kubenswrapper[4825]: I1007 19:14:54.422943 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"]
Oct 07 19:14:55 crc kubenswrapper[4825]: I1007 19:14:55.302154 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc" event={"ID":"0b0eb630-7794-4425-9ada-29b15acb6bdb","Type":"ContainerStarted","Data":"9af6e845bb7d147c183e1bcdd8c19dd9f0199ebe74891d0259e11483eabbb2c8"}
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.142471 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"]
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.143803 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.146315 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.146743 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.163106 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"]
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.266558 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-secret-volume\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.266818 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx228\" (UniqueName: \"kubernetes.io/projected/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-kube-api-access-bx228\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.267071 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-config-volume\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.349058 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc" event={"ID":"0b0eb630-7794-4425-9ada-29b15acb6bdb","Type":"ContainerStarted","Data":"6dda9295386da70ca5334a73c0e6f06aed1f39f089513c59d7c185b17df71baf"}
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.367832 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-secret-volume\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.368091 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx228\" (UniqueName: \"kubernetes.io/projected/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-kube-api-access-bx228\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.368218 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-config-volume\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.368989 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-config-volume\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.373704 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-secret-volume\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.387743 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx228\" (UniqueName: \"kubernetes.io/projected/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-kube-api-access-bx228\") pod \"collect-profiles-29331075-rqvtx\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.513180 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:00 crc kubenswrapper[4825]: I1007 19:15:00.956194 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"]
Oct 07 19:15:01 crc kubenswrapper[4825]: W1007 19:15:01.083945 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f311fd9_7ab5_4cf4_80ed_701ed5d212ef.slice/crio-c53f30823c37cc1a9b0fa6830b54738bbc69185b106a9a0240d593c9404b42cb WatchSource:0}: Error finding container c53f30823c37cc1a9b0fa6830b54738bbc69185b106a9a0240d593c9404b42cb: Status 404 returned error can't find the container with id c53f30823c37cc1a9b0fa6830b54738bbc69185b106a9a0240d593c9404b42cb
Oct 07 19:15:01 crc kubenswrapper[4825]: I1007 19:15:01.356563 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx" event={"ID":"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef","Type":"ContainerStarted","Data":"1c9a330ee06367529325684f48ea3c7d3f7debbc6089f4b5c52cf6b88464fa9c"}
Oct 07 19:15:01 crc kubenswrapper[4825]: I1007 19:15:01.356619 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx" event={"ID":"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef","Type":"ContainerStarted","Data":"c53f30823c37cc1a9b0fa6830b54738bbc69185b106a9a0240d593c9404b42cb"}
Oct 07 19:15:01 crc kubenswrapper[4825]: I1007 19:15:01.387819 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx" podStartSLOduration=1.387802515 podStartE2EDuration="1.387802515s" podCreationTimestamp="2025-10-07 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:15:01.383696684 +0000 UTC m=+890.205735321" watchObservedRunningTime="2025-10-07 19:15:01.387802515 +0000 UTC m=+890.209841152"
Oct 07 19:15:02 crc kubenswrapper[4825]: I1007 19:15:02.362528 4825 generic.go:334] "Generic (PLEG): container finished" podID="0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" containerID="1c9a330ee06367529325684f48ea3c7d3f7debbc6089f4b5c52cf6b88464fa9c" exitCode=0
Oct 07 19:15:02 crc kubenswrapper[4825]: I1007 19:15:02.362626 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx" event={"ID":"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef","Type":"ContainerDied","Data":"1c9a330ee06367529325684f48ea3c7d3f7debbc6089f4b5c52cf6b88464fa9c"}
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.370010 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc" event={"ID":"0b0eb630-7794-4425-9ada-29b15acb6bdb","Type":"ContainerStarted","Data":"1cc4468a7efd7781aab281337188c501214543f75b743d51810cb6d035fe973d"}
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.370103 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc"
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.408945 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc" podStartSLOduration=2.473508834 podStartE2EDuration="10.408926071s" podCreationTimestamp="2025-10-07 19:14:53 +0000 UTC" firstStartedPulling="2025-10-07 19:14:54.435776675 +0000 UTC m=+883.257815312" lastFinishedPulling="2025-10-07 19:15:02.371193912 +0000 UTC m=+891.193232549" observedRunningTime="2025-10-07 19:15:03.408523158 +0000 UTC m=+892.230561855" watchObservedRunningTime="2025-10-07 19:15:03.408926071 +0000 UTC m=+892.230964708"
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.712722 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.828126 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx228\" (UniqueName: \"kubernetes.io/projected/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-kube-api-access-bx228\") pod \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") "
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.828165 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-config-volume\") pod \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") "
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.828190 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-secret-volume\") pod \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\" (UID: \"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef\") "
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.829104 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" (UID: "0f311fd9-7ab5-4cf4-80ed-701ed5d212ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.832608 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.835426 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-kube-api-access-bx228" (OuterVolumeSpecName: "kube-api-access-bx228") pod "0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" (UID: "0f311fd9-7ab5-4cf4-80ed-701ed5d212ef"). InnerVolumeSpecName "kube-api-access-bx228". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.836283 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" (UID: "0f311fd9-7ab5-4cf4-80ed-701ed5d212ef"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.933866 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx228\" (UniqueName: \"kubernetes.io/projected/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-kube-api-access-bx228\") on node \"crc\" DevicePath \"\"" Oct 07 19:15:03 crc kubenswrapper[4825]: I1007 19:15:03.933906 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:15:04 crc kubenswrapper[4825]: I1007 19:15:04.379257 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx" event={"ID":"0f311fd9-7ab5-4cf4-80ed-701ed5d212ef","Type":"ContainerDied","Data":"c53f30823c37cc1a9b0fa6830b54738bbc69185b106a9a0240d593c9404b42cb"} Oct 07 19:15:04 crc kubenswrapper[4825]: I1007 19:15:04.379687 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c53f30823c37cc1a9b0fa6830b54738bbc69185b106a9a0240d593c9404b42cb" Oct 07 19:15:04 crc kubenswrapper[4825]: I1007 19:15:04.379333 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx" Oct 07 19:15:04 crc kubenswrapper[4825]: I1007 19:15:04.384452 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-w4tpc" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.490257 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx"] Oct 07 19:15:38 crc kubenswrapper[4825]: E1007 19:15:38.491102 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" containerName="collect-profiles" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.491115 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" containerName="collect-profiles" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.491249 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" containerName="collect-profiles" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.491892 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.494435 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-tvx92" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.495512 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.496756 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.498094 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-799p4" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.507560 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.514099 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.520082 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ksck\" (UniqueName: \"kubernetes.io/projected/ef71a3c8-e986-4f19-a234-9e9ef7749132-kube-api-access-9ksck\") pod \"cinder-operator-controller-manager-7d4d4f8d-v4dk9\" (UID: \"ef71a3c8-e986-4f19-a234-9e9ef7749132\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.520156 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtnvp\" (UniqueName: \"kubernetes.io/projected/97dc66cd-4313-4951-b85c-dedd5cd2e6ba-kube-api-access-xtnvp\") pod \"barbican-operator-controller-manager-58c4cd55f4-mrnwx\" (UID: \"97dc66cd-4313-4951-b85c-dedd5cd2e6ba\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.521570 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.527294 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.530066 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hpjlp" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.549555 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.555170 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.556312 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.566079 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-kg7pn" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.573191 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.574250 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.576478 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vpfpn" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.591528 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.600948 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.602554 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.606900 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-8swks" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.607969 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.622554 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.623857 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtnvp\" (UniqueName: \"kubernetes.io/projected/97dc66cd-4313-4951-b85c-dedd5cd2e6ba-kube-api-access-xtnvp\") pod \"barbican-operator-controller-manager-58c4cd55f4-mrnwx\" (UID: \"97dc66cd-4313-4951-b85c-dedd5cd2e6ba\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" Oct 07 19:15:38 crc 
kubenswrapper[4825]: I1007 19:15:38.623921 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq6md\" (UniqueName: \"kubernetes.io/projected/a0f7df98-caae-40a5-bb89-94123bce0763-kube-api-access-kq6md\") pod \"glance-operator-controller-manager-5dc44df7d5-n28d6\" (UID: \"a0f7df98-caae-40a5-bb89-94123bce0763\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.623984 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fllh\" (UniqueName: \"kubernetes.io/projected/c310d873-fc90-4658-ab06-ffa16a97c784-kube-api-access-4fllh\") pod \"designate-operator-controller-manager-75dfd9b554-ln2cd\" (UID: \"c310d873-fc90-4658-ab06-ffa16a97c784\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.624018 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmpvx\" (UniqueName: \"kubernetes.io/projected/282406b3-2501-4b01-adf1-d952fc240404-kube-api-access-jmpvx\") pod \"horizon-operator-controller-manager-76d5b87f47-b9tp8\" (UID: \"282406b3-2501-4b01-adf1-d952fc240404\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.624041 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksck\" (UniqueName: \"kubernetes.io/projected/ef71a3c8-e986-4f19-a234-9e9ef7749132-kube-api-access-9ksck\") pod \"cinder-operator-controller-manager-7d4d4f8d-v4dk9\" (UID: \"ef71a3c8-e986-4f19-a234-9e9ef7749132\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.624074 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h5zv\" (UniqueName: \"kubernetes.io/projected/62bd3185-8c68-419d-b523-2de43d8dd015-kube-api-access-4h5zv\") pod \"heat-operator-controller-manager-54b4974c45-jrh49\" (UID: \"62bd3185-8c68-419d-b523-2de43d8dd015\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.637948 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.638918 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.640551 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rfg28" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.645355 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.659752 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtnvp\" (UniqueName: \"kubernetes.io/projected/97dc66cd-4313-4951-b85c-dedd5cd2e6ba-kube-api-access-xtnvp\") pod \"barbican-operator-controller-manager-58c4cd55f4-mrnwx\" (UID: \"97dc66cd-4313-4951-b85c-dedd5cd2e6ba\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.660024 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-m87x4"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.673781 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.694928 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-j82k7" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.708501 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ksck\" (UniqueName: \"kubernetes.io/projected/ef71a3c8-e986-4f19-a234-9e9ef7749132-kube-api-access-9ksck\") pod \"cinder-operator-controller-manager-7d4d4f8d-v4dk9\" (UID: \"ef71a3c8-e986-4f19-a234-9e9ef7749132\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.713631 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.717608 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-m87x4"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.727683 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.728306 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq6md\" (UniqueName: \"kubernetes.io/projected/a0f7df98-caae-40a5-bb89-94123bce0763-kube-api-access-kq6md\") pod \"glance-operator-controller-manager-5dc44df7d5-n28d6\" (UID: \"a0f7df98-caae-40a5-bb89-94123bce0763\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.728349 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fllh\" (UniqueName: 
\"kubernetes.io/projected/c310d873-fc90-4658-ab06-ffa16a97c784-kube-api-access-4fllh\") pod \"designate-operator-controller-manager-75dfd9b554-ln2cd\" (UID: \"c310d873-fc90-4658-ab06-ffa16a97c784\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.728371 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmpvx\" (UniqueName: \"kubernetes.io/projected/282406b3-2501-4b01-adf1-d952fc240404-kube-api-access-jmpvx\") pod \"horizon-operator-controller-manager-76d5b87f47-b9tp8\" (UID: \"282406b3-2501-4b01-adf1-d952fc240404\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.728392 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h5zv\" (UniqueName: \"kubernetes.io/projected/62bd3185-8c68-419d-b523-2de43d8dd015-kube-api-access-4h5zv\") pod \"heat-operator-controller-manager-54b4974c45-jrh49\" (UID: \"62bd3185-8c68-419d-b523-2de43d8dd015\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.729326 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.734642 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6f5xw" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.734999 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.739322 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.741020 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.749596 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.751724 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.754446 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-598bs" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.754585 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.757019 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fkhvb" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.760773 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h5zv\" (UniqueName: \"kubernetes.io/projected/62bd3185-8c68-419d-b523-2de43d8dd015-kube-api-access-4h5zv\") pod \"heat-operator-controller-manager-54b4974c45-jrh49\" (UID: \"62bd3185-8c68-419d-b523-2de43d8dd015\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.760838 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmpvx\" (UniqueName: \"kubernetes.io/projected/282406b3-2501-4b01-adf1-d952fc240404-kube-api-access-jmpvx\") pod \"horizon-operator-controller-manager-76d5b87f47-b9tp8\" (UID: \"282406b3-2501-4b01-adf1-d952fc240404\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.762391 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fllh\" (UniqueName: \"kubernetes.io/projected/c310d873-fc90-4658-ab06-ffa16a97c784-kube-api-access-4fllh\") pod \"designate-operator-controller-manager-75dfd9b554-ln2cd\" (UID: \"c310d873-fc90-4658-ab06-ffa16a97c784\") " 
pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.762958 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq6md\" (UniqueName: \"kubernetes.io/projected/a0f7df98-caae-40a5-bb89-94123bce0763-kube-api-access-kq6md\") pod \"glance-operator-controller-manager-5dc44df7d5-n28d6\" (UID: \"a0f7df98-caae-40a5-bb89-94123bce0763\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.810402 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.815356 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.815659 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.824673 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.827245 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.827376 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.829567 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfdf\" (UniqueName: \"kubernetes.io/projected/3f63b792-0ed9-453e-8dff-afac52bac339-kube-api-access-8mfdf\") pod \"ironic-operator-controller-manager-649675d675-m87x4\" (UID: \"3f63b792-0ed9-453e-8dff-afac52bac339\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.829611 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/1b11f862-ee30-4996-a8fb-218b3c27f07a-kube-api-access-6blh4\") pod \"infra-operator-controller-manager-658588b8c9-f9pdx\" (UID: \"1b11f862-ee30-4996-a8fb-218b3c27f07a\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.829683 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b11f862-ee30-4996-a8fb-218b3c27f07a-cert\") pod \"infra-operator-controller-manager-658588b8c9-f9pdx\" (UID: \"1b11f862-ee30-4996-a8fb-218b3c27f07a\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.834535 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.834958 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hk87s" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.835636 4825 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.837318 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-c5cbb" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.842532 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.845167 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.868280 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.874005 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.874049 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.877076 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c4v4h" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.882205 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.883765 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.886446 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.887007 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7pj6n" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.894164 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.897639 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.898945 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.901620 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-jl7wl" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.904163 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.914473 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.922063 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.923132 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.927596 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-v6b5x" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.930831 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxxfs\" (UniqueName: \"kubernetes.io/projected/22f18be3-b165-4b14-90bd-3eac19ae3fee-kube-api-access-sxxfs\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm\" (UID: \"22f18be3-b165-4b14-90bd-3eac19ae3fee\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.930868 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs6bk\" (UniqueName: \"kubernetes.io/projected/528dd884-a7df-4574-920f-86ae0d779b62-kube-api-access-gs6bk\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-5mng4\" (UID: \"528dd884-a7df-4574-920f-86ae0d779b62\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.930900 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b11f862-ee30-4996-a8fb-218b3c27f07a-cert\") pod \"infra-operator-controller-manager-658588b8c9-f9pdx\" (UID: \"1b11f862-ee30-4996-a8fb-218b3c27f07a\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.930932 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv4rt\" (UniqueName: \"kubernetes.io/projected/01529960-5bd1-4a4d-8703-8d6a3ff38d4b-kube-api-access-sv4rt\") pod 
\"neutron-operator-controller-manager-8d984cc4d-fpkzs\" (UID: \"01529960-5bd1-4a4d-8703-8d6a3ff38d4b\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.930953 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfdf\" (UniqueName: \"kubernetes.io/projected/3f63b792-0ed9-453e-8dff-afac52bac339-kube-api-access-8mfdf\") pod \"ironic-operator-controller-manager-649675d675-m87x4\" (UID: \"3f63b792-0ed9-453e-8dff-afac52bac339\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.930988 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/1b11f862-ee30-4996-a8fb-218b3c27f07a-kube-api-access-6blh4\") pod \"infra-operator-controller-manager-658588b8c9-f9pdx\" (UID: \"1b11f862-ee30-4996-a8fb-218b3c27f07a\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.931026 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zptmn\" (UniqueName: \"kubernetes.io/projected/9e61b6db-a40e-4ce3-8086-e51bbc6f6295-kube-api-access-zptmn\") pod \"manila-operator-controller-manager-65d89cfd9f-9nndm\" (UID: \"9e61b6db-a40e-4ce3-8086-e51bbc6f6295\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.934646 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.945653 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.951953 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b11f862-ee30-4996-a8fb-218b3c27f07a-cert\") pod \"infra-operator-controller-manager-658588b8c9-f9pdx\" (UID: \"1b11f862-ee30-4996-a8fb-218b3c27f07a\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.955110 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfdf\" (UniqueName: \"kubernetes.io/projected/3f63b792-0ed9-453e-8dff-afac52bac339-kube-api-access-8mfdf\") pod \"ironic-operator-controller-manager-649675d675-m87x4\" (UID: \"3f63b792-0ed9-453e-8dff-afac52bac339\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.955680 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.957150 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.964915 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp"] Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.968859 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jvsv6" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.976140 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/1b11f862-ee30-4996-a8fb-218b3c27f07a-kube-api-access-6blh4\") pod \"infra-operator-controller-manager-658588b8c9-f9pdx\" (UID: \"1b11f862-ee30-4996-a8fb-218b3c27f07a\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:38 crc kubenswrapper[4825]: I1007 19:15:38.979486 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.001542 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.001751 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.002976 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.005504 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.007021 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-26rqn" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035112 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035219 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsk6\" (UniqueName: \"kubernetes.io/projected/0ecb1a32-2936-470c-a9c5-6701d461cd71-kube-api-access-sxsk6\") pod \"octavia-operator-controller-manager-7468f855d8-hjgxc\" (UID: \"0ecb1a32-2936-470c-a9c5-6701d461cd71\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035282 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndc5\" (UniqueName: \"kubernetes.io/projected/8cacb372-6381-4182-92eb-81e607f7cf31-kube-api-access-tndc5\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv\" (UID: \"8cacb372-6381-4182-92eb-81e607f7cf31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035321 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkkh\" (UniqueName: \"kubernetes.io/projected/0caa8db7-d83d-47bd-9276-29102dd20de8-kube-api-access-dwkkh\") pod 
\"nova-operator-controller-manager-7c7fc454ff-vn86z\" (UID: \"0caa8db7-d83d-47bd-9276-29102dd20de8\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035340 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlb6\" (UniqueName: \"kubernetes.io/projected/3b8778b6-81a2-4e3c-b464-6e5c8e063a4b-kube-api-access-7jlb6\") pod \"placement-operator-controller-manager-54689d9f88-lp8qp\" (UID: \"3b8778b6-81a2-4e3c-b464-6e5c8e063a4b\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035361 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zptmn\" (UniqueName: \"kubernetes.io/projected/9e61b6db-a40e-4ce3-8086-e51bbc6f6295-kube-api-access-zptmn\") pod \"manila-operator-controller-manager-65d89cfd9f-9nndm\" (UID: \"9e61b6db-a40e-4ce3-8086-e51bbc6f6295\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fhb\" (UniqueName: \"kubernetes.io/projected/844cfe74-a770-4268-a60a-372586ac0744-kube-api-access-h4fhb\") pod \"ovn-operator-controller-manager-6d8b6f9b9-9llfs\" (UID: \"844cfe74-a770-4268-a60a-372586ac0744\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035413 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxfs\" (UniqueName: \"kubernetes.io/projected/22f18be3-b165-4b14-90bd-3eac19ae3fee-kube-api-access-sxxfs\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm\" (UID: \"22f18be3-b165-4b14-90bd-3eac19ae3fee\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035430 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cacb372-6381-4182-92eb-81e607f7cf31-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv\" (UID: \"8cacb372-6381-4182-92eb-81e607f7cf31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035458 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs6bk\" (UniqueName: \"kubernetes.io/projected/528dd884-a7df-4574-920f-86ae0d779b62-kube-api-access-gs6bk\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-5mng4\" (UID: \"528dd884-a7df-4574-920f-86ae0d779b62\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.035498 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv4rt\" (UniqueName: \"kubernetes.io/projected/01529960-5bd1-4a4d-8703-8d6a3ff38d4b-kube-api-access-sv4rt\") pod \"neutron-operator-controller-manager-8d984cc4d-fpkzs\" (UID: \"01529960-5bd1-4a4d-8703-8d6a3ff38d4b\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.036640 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.040725 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.073619 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-r7lc8" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.089454 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.095684 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs6bk\" (UniqueName: \"kubernetes.io/projected/528dd884-a7df-4574-920f-86ae0d779b62-kube-api-access-gs6bk\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-5mng4\" (UID: \"528dd884-a7df-4574-920f-86ae0d779b62\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.103530 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxfs\" (UniqueName: \"kubernetes.io/projected/22f18be3-b165-4b14-90bd-3eac19ae3fee-kube-api-access-sxxfs\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm\" (UID: \"22f18be3-b165-4b14-90bd-3eac19ae3fee\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.103603 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zptmn\" (UniqueName: \"kubernetes.io/projected/9e61b6db-a40e-4ce3-8086-e51bbc6f6295-kube-api-access-zptmn\") pod \"manila-operator-controller-manager-65d89cfd9f-9nndm\" (UID: \"9e61b6db-a40e-4ce3-8086-e51bbc6f6295\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.104474 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sv4rt\" (UniqueName: \"kubernetes.io/projected/01529960-5bd1-4a4d-8703-8d6a3ff38d4b-kube-api-access-sv4rt\") pod \"neutron-operator-controller-manager-8d984cc4d-fpkzs\" (UID: \"01529960-5bd1-4a4d-8703-8d6a3ff38d4b\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.118632 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137028 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fhb\" (UniqueName: \"kubernetes.io/projected/844cfe74-a770-4268-a60a-372586ac0744-kube-api-access-h4fhb\") pod \"ovn-operator-controller-manager-6d8b6f9b9-9llfs\" (UID: \"844cfe74-a770-4268-a60a-372586ac0744\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cacb372-6381-4182-92eb-81e607f7cf31-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv\" (UID: \"8cacb372-6381-4182-92eb-81e607f7cf31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137158 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbntw\" (UniqueName: \"kubernetes.io/projected/7f5bc608-3853-4a58-ac8d-18f57baffe4c-kube-api-access-tbntw\") pod \"telemetry-operator-controller-manager-5d4d74dd89-jvj5c\" (UID: \"7f5bc608-3853-4a58-ac8d-18f57baffe4c\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137200 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8vpf\" (UniqueName: \"kubernetes.io/projected/c720cafe-11e6-4959-8228-b03cdb65242d-kube-api-access-d8vpf\") pod \"swift-operator-controller-manager-6859f9b676-bs76l\" (UID: \"c720cafe-11e6-4959-8228-b03cdb65242d\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137219 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsk6\" (UniqueName: \"kubernetes.io/projected/0ecb1a32-2936-470c-a9c5-6701d461cd71-kube-api-access-sxsk6\") pod \"octavia-operator-controller-manager-7468f855d8-hjgxc\" (UID: \"0ecb1a32-2936-470c-a9c5-6701d461cd71\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137256 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tndc5\" (UniqueName: \"kubernetes.io/projected/8cacb372-6381-4182-92eb-81e607f7cf31-kube-api-access-tndc5\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv\" (UID: \"8cacb372-6381-4182-92eb-81e607f7cf31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137293 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkkh\" (UniqueName: \"kubernetes.io/projected/0caa8db7-d83d-47bd-9276-29102dd20de8-kube-api-access-dwkkh\") pod \"nova-operator-controller-manager-7c7fc454ff-vn86z\" (UID: \"0caa8db7-d83d-47bd-9276-29102dd20de8\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137312 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlb6\" (UniqueName: 
\"kubernetes.io/projected/3b8778b6-81a2-4e3c-b464-6e5c8e063a4b-kube-api-access-7jlb6\") pod \"placement-operator-controller-manager-54689d9f88-lp8qp\" (UID: \"3b8778b6-81a2-4e3c-b464-6e5c8e063a4b\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" Oct 07 19:15:39 crc kubenswrapper[4825]: E1007 19:15:39.137429 4825 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 19:15:39 crc kubenswrapper[4825]: E1007 19:15:39.137500 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cacb372-6381-4182-92eb-81e607f7cf31-cert podName:8cacb372-6381-4182-92eb-81e607f7cf31 nodeName:}" failed. No retries permitted until 2025-10-07 19:15:39.637480991 +0000 UTC m=+928.459519628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cacb372-6381-4182-92eb-81e607f7cf31-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" (UID: "8cacb372-6381-4182-92eb-81e607f7cf31") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.137666 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.141834 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.143048 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.148574 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.152116 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-kx6df" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.158967 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fhb\" (UniqueName: \"kubernetes.io/projected/844cfe74-a770-4268-a60a-372586ac0744-kube-api-access-h4fhb\") pod \"ovn-operator-controller-manager-6d8b6f9b9-9llfs\" (UID: \"844cfe74-a770-4268-a60a-372586ac0744\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.158987 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxsk6\" (UniqueName: \"kubernetes.io/projected/0ecb1a32-2936-470c-a9c5-6701d461cd71-kube-api-access-sxsk6\") pod \"octavia-operator-controller-manager-7468f855d8-hjgxc\" (UID: \"0ecb1a32-2936-470c-a9c5-6701d461cd71\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.159539 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlb6\" (UniqueName: \"kubernetes.io/projected/3b8778b6-81a2-4e3c-b464-6e5c8e063a4b-kube-api-access-7jlb6\") pod \"placement-operator-controller-manager-54689d9f88-lp8qp\" (UID: \"3b8778b6-81a2-4e3c-b464-6e5c8e063a4b\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.159620 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dwkkh\" (UniqueName: \"kubernetes.io/projected/0caa8db7-d83d-47bd-9276-29102dd20de8-kube-api-access-dwkkh\") pod \"nova-operator-controller-manager-7c7fc454ff-vn86z\" (UID: \"0caa8db7-d83d-47bd-9276-29102dd20de8\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.160255 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndc5\" (UniqueName: \"kubernetes.io/projected/8cacb372-6381-4182-92eb-81e607f7cf31-kube-api-access-tndc5\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv\" (UID: \"8cacb372-6381-4182-92eb-81e607f7cf31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.194201 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.214779 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.227206 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.227632 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.229420 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.237643 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-csdnm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.237850 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.244564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4pm\" (UniqueName: \"kubernetes.io/projected/bd6d051a-119d-45c5-9b81-939bba328c56-kube-api-access-lt4pm\") pod \"test-operator-controller-manager-5cd5cb47d7-lg8z7\" (UID: \"bd6d051a-119d-45c5-9b81-939bba328c56\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.244635 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbntw\" (UniqueName: \"kubernetes.io/projected/7f5bc608-3853-4a58-ac8d-18f57baffe4c-kube-api-access-tbntw\") pod \"telemetry-operator-controller-manager-5d4d74dd89-jvj5c\" (UID: \"7f5bc608-3853-4a58-ac8d-18f57baffe4c\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.244666 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr5f6\" (UniqueName: \"kubernetes.io/projected/f99d1a15-090e-4a5e-a210-690be64c4742-kube-api-access-cr5f6\") pod \"watcher-operator-controller-manager-6cbc6dd547-zbnmp\" (UID: \"f99d1a15-090e-4a5e-a210-690be64c4742\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.244827 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8vpf\" (UniqueName: \"kubernetes.io/projected/c720cafe-11e6-4959-8228-b03cdb65242d-kube-api-access-d8vpf\") pod \"swift-operator-controller-manager-6859f9b676-bs76l\" (UID: \"c720cafe-11e6-4959-8228-b03cdb65242d\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.246615 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.249236 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.266553 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8vpf\" (UniqueName: \"kubernetes.io/projected/c720cafe-11e6-4959-8228-b03cdb65242d-kube-api-access-d8vpf\") pod \"swift-operator-controller-manager-6859f9b676-bs76l\" (UID: \"c720cafe-11e6-4959-8228-b03cdb65242d\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.295939 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbntw\" (UniqueName: \"kubernetes.io/projected/7f5bc608-3853-4a58-ac8d-18f57baffe4c-kube-api-access-tbntw\") pod \"telemetry-operator-controller-manager-5d4d74dd89-jvj5c\" (UID: \"7f5bc608-3853-4a58-ac8d-18f57baffe4c\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.296416 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.297303 4825 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.301196 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-cmrxd" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.308445 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.308641 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.328256 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.348632 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr5f6\" (UniqueName: \"kubernetes.io/projected/f99d1a15-090e-4a5e-a210-690be64c4742-kube-api-access-cr5f6\") pod \"watcher-operator-controller-manager-6cbc6dd547-zbnmp\" (UID: \"f99d1a15-090e-4a5e-a210-690be64c4742\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.348720 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxq5c\" (UniqueName: \"kubernetes.io/projected/063bceb1-c26d-453a-a74a-e6874c273034-kube-api-access-pxq5c\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rgt49\" (UID: \"063bceb1-c26d-453a-a74a-e6874c273034\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.348829 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgsgw\" (UniqueName: \"kubernetes.io/projected/3463b9a9-3935-4a41-b710-77084296fa18-kube-api-access-lgsgw\") pod \"openstack-operator-controller-manager-77dffbdc98-rxjhm\" (UID: \"3463b9a9-3935-4a41-b710-77084296fa18\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.348853 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3463b9a9-3935-4a41-b710-77084296fa18-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-rxjhm\" (UID: \"3463b9a9-3935-4a41-b710-77084296fa18\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.348979 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4pm\" (UniqueName: \"kubernetes.io/projected/bd6d051a-119d-45c5-9b81-939bba328c56-kube-api-access-lt4pm\") pod \"test-operator-controller-manager-5cd5cb47d7-lg8z7\" (UID: \"bd6d051a-119d-45c5-9b81-939bba328c56\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.382497 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr5f6\" (UniqueName: \"kubernetes.io/projected/f99d1a15-090e-4a5e-a210-690be64c4742-kube-api-access-cr5f6\") pod \"watcher-operator-controller-manager-6cbc6dd547-zbnmp\" (UID: \"f99d1a15-090e-4a5e-a210-690be64c4742\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.392753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4pm\" (UniqueName: 
\"kubernetes.io/projected/bd6d051a-119d-45c5-9b81-939bba328c56-kube-api-access-lt4pm\") pod \"test-operator-controller-manager-5cd5cb47d7-lg8z7\" (UID: \"bd6d051a-119d-45c5-9b81-939bba328c56\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.450138 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxq5c\" (UniqueName: \"kubernetes.io/projected/063bceb1-c26d-453a-a74a-e6874c273034-kube-api-access-pxq5c\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rgt49\" (UID: \"063bceb1-c26d-453a-a74a-e6874c273034\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.450249 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgsgw\" (UniqueName: \"kubernetes.io/projected/3463b9a9-3935-4a41-b710-77084296fa18-kube-api-access-lgsgw\") pod \"openstack-operator-controller-manager-77dffbdc98-rxjhm\" (UID: \"3463b9a9-3935-4a41-b710-77084296fa18\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.450279 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3463b9a9-3935-4a41-b710-77084296fa18-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-rxjhm\" (UID: \"3463b9a9-3935-4a41-b710-77084296fa18\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:39 crc kubenswrapper[4825]: E1007 19:15:39.450449 4825 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 07 19:15:39 crc kubenswrapper[4825]: E1007 19:15:39.450504 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3463b9a9-3935-4a41-b710-77084296fa18-cert 
podName:3463b9a9-3935-4a41-b710-77084296fa18 nodeName:}" failed. No retries permitted until 2025-10-07 19:15:39.950486834 +0000 UTC m=+928.772525471 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3463b9a9-3935-4a41-b710-77084296fa18-cert") pod "openstack-operator-controller-manager-77dffbdc98-rxjhm" (UID: "3463b9a9-3935-4a41-b710-77084296fa18") : secret "webhook-server-cert" not found Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.460452 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.465558 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgsgw\" (UniqueName: \"kubernetes.io/projected/3463b9a9-3935-4a41-b710-77084296fa18-kube-api-access-lgsgw\") pod \"openstack-operator-controller-manager-77dffbdc98-rxjhm\" (UID: \"3463b9a9-3935-4a41-b710-77084296fa18\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.472892 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxq5c\" (UniqueName: \"kubernetes.io/projected/063bceb1-c26d-453a-a74a-e6874c273034-kube-api-access-pxq5c\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rgt49\" (UID: \"063bceb1-c26d-453a-a74a-e6874c273034\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.482406 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.515296 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9"] Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.523398 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.549271 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" Oct 07 19:15:39 crc kubenswrapper[4825]: W1007 19:15:39.577395 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef71a3c8_e986_4f19_a234_9e9ef7749132.slice/crio-c58978e2f9d0e4de1b468bdae6eacc2d0d04302642e40a23dd9a4785664679ba WatchSource:0}: Error finding container c58978e2f9d0e4de1b468bdae6eacc2d0d04302642e40a23dd9a4785664679ba: Status 404 returned error can't find the container with id c58978e2f9d0e4de1b468bdae6eacc2d0d04302642e40a23dd9a4785664679ba Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.580206 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx"] Oct 07 19:15:39 crc kubenswrapper[4825]: W1007 19:15:39.612593 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97dc66cd_4313_4951_b85c_dedd5cd2e6ba.slice/crio-3d904badd9629bac841dbbc08a44f9d4f3c441fd63c7779ea935b576f1fad564 WatchSource:0}: Error finding container 3d904badd9629bac841dbbc08a44f9d4f3c441fd63c7779ea935b576f1fad564: Status 404 returned error can't find the container with id 
3d904badd9629bac841dbbc08a44f9d4f3c441fd63c7779ea935b576f1fad564 Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.649155 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.654069 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cacb372-6381-4182-92eb-81e607f7cf31-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv\" (UID: \"8cacb372-6381-4182-92eb-81e607f7cf31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.661017 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cacb372-6381-4182-92eb-81e607f7cf31-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv\" (UID: \"8cacb372-6381-4182-92eb-81e607f7cf31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.664178 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" event={"ID":"ef71a3c8-e986-4f19-a234-9e9ef7749132","Type":"ContainerStarted","Data":"c58978e2f9d0e4de1b468bdae6eacc2d0d04302642e40a23dd9a4785664679ba"} Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.695085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" event={"ID":"97dc66cd-4313-4951-b85c-dedd5cd2e6ba","Type":"ContainerStarted","Data":"3d904badd9629bac841dbbc08a44f9d4f3c441fd63c7779ea935b576f1fad564"} Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.880068 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.961103 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3463b9a9-3935-4a41-b710-77084296fa18-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-rxjhm\" (UID: \"3463b9a9-3935-4a41-b710-77084296fa18\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:39 crc kubenswrapper[4825]: I1007 19:15:39.971767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3463b9a9-3935-4a41-b710-77084296fa18-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-rxjhm\" (UID: \"3463b9a9-3935-4a41-b710-77084296fa18\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.067274 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.071883 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49"] Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.079757 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc310d873_fc90_4658_ab06_ffa16a97c784.slice/crio-156494bbd4609c771167f95a6c1a219d88a4a16e2a38dd47c8b0d1fe06160b4f WatchSource:0}: Error finding container 156494bbd4609c771167f95a6c1a219d88a4a16e2a38dd47c8b0d1fe06160b4f: Status 404 returned error can't find the container with id 156494bbd4609c771167f95a6c1a219d88a4a16e2a38dd47c8b0d1fe06160b4f Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.175330 4825 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528dd884_a7df_4574_920f_86ae0d779b62.slice/crio-0e9c3dac78a851965f5c342f7f78fe8b6113b149911657586c93cda021f21371 WatchSource:0}: Error finding container 0e9c3dac78a851965f5c342f7f78fe8b6113b149911657586c93cda021f21371: Status 404 returned error can't find the container with id 0e9c3dac78a851965f5c342f7f78fe8b6113b149911657586c93cda021f21371 Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.179562 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.185053 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.185606 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-m87x4"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.190250 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.196214 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.205247 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.215361 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm"] Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.224666 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e61b6db_a40e_4ce3_8086_e51bbc6f6295.slice/crio-50e16c365d4a5c3da367712ed9abca60c14bdffbf0a00399f6b0eb5c2f6fed74 WatchSource:0}: Error finding container 50e16c365d4a5c3da367712ed9abca60c14bdffbf0a00399f6b0eb5c2f6fed74: Status 404 returned error can't find the container with id 50e16c365d4a5c3da367712ed9abca60c14bdffbf0a00399f6b0eb5c2f6fed74 Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.377690 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs"] Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.386861 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod844cfe74_a770_4268_a60a_372586ac0744.slice/crio-36e8f797183d3591242c47bb3581ffabdb29980a4f2f4e437e97e9dcfcba4b82 WatchSource:0}: Error finding container 36e8f797183d3591242c47bb3581ffabdb29980a4f2f4e437e97e9dcfcba4b82: Status 404 returned error can't find the container with id 36e8f797183d3591242c47bb3581ffabdb29980a4f2f4e437e97e9dcfcba4b82 Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.389515 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs"] Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.393717 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc720cafe_11e6_4959_8228_b03cdb65242d.slice/crio-cd076d7a7b917d2a2157fa4b1c48c57e463e5bfbe83cc1c160312b753e144d66 WatchSource:0}: Error finding container cd076d7a7b917d2a2157fa4b1c48c57e463e5bfbe83cc1c160312b753e144d66: Status 404 returned error can't find the container with id cd076d7a7b917d2a2157fa4b1c48c57e463e5bfbe83cc1c160312b753e144d66 Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.402048 4825 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b8778b6_81a2_4e3c_b464_6e5c8e063a4b.slice/crio-6842e9a4e7d514c22361d86af670523bc0c43ac3dcebc691f6a210a75a631c8f WatchSource:0}: Error finding container 6842e9a4e7d514c22361d86af670523bc0c43ac3dcebc691f6a210a75a631c8f: Status 404 returned error can't find the container with id 6842e9a4e7d514c22361d86af670523bc0c43ac3dcebc691f6a210a75a631c8f Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.403810 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0caa8db7_d83d_47bd_9276_29102dd20de8.slice/crio-1ad0dfd11c7a1abe0f51db3dce48cb3318ad7e90379c05c81d6a66fbd296eabb WatchSource:0}: Error finding container 1ad0dfd11c7a1abe0f51db3dce48cb3318ad7e90379c05c81d6a66fbd296eabb: Status 404 returned error can't find the container with id 1ad0dfd11c7a1abe0f51db3dce48cb3318ad7e90379c05c81d6a66fbd296eabb Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.405428 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z"] Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.406568 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dwkkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7c7fc454ff-vn86z_openstack-operators(0caa8db7-d83d-47bd-9276-29102dd20de8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.411724 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp"] Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.413583 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f18be3_b165_4b14_90bd_3eac19ae3fee.slice/crio-ba1b25b86107aaba7d4779a1697da7e0aba491dc9b6eb37afcb7623683954ccf WatchSource:0}: Error finding container ba1b25b86107aaba7d4779a1697da7e0aba491dc9b6eb37afcb7623683954ccf: Status 404 returned error can't find the container with id ba1b25b86107aaba7d4779a1697da7e0aba491dc9b6eb37afcb7623683954ccf Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.413972 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jlb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-lp8qp_openstack-operators(3b8778b6-81a2-4e3c-b464-6e5c8e063a4b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.414404 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ecb1a32_2936_470c_a9c5_6701d461cd71.slice/crio-826e22b76931063620c99f69f9ebce8ccf3dc1729f1489d3aea46f2a849bf2b0 WatchSource:0}: Error finding container 826e22b76931063620c99f69f9ebce8ccf3dc1729f1489d3aea46f2a849bf2b0: Status 404 returned error can't find the container with id 826e22b76931063620c99f69f9ebce8ccf3dc1729f1489d3aea46f2a849bf2b0 Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.415172 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxxfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm_openstack-operators(22f18be3-b165-4b14-90bd-3eac19ae3fee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.416304 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxsk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7468f855d8-hjgxc_openstack-operators(0ecb1a32-2936-470c-a9c5-6701d461cd71): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.416898 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.421157 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.441410 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.550243 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.557112 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp"] Oct 07 19:15:40 crc 
kubenswrapper[4825]: I1007 19:15:40.570609 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.575081 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.578141 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c"] Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.580694 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cr5f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6cbc6dd547-zbnmp_openstack-operators(f99d1a15-090e-4a5e-a210-690be64c4742): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.588628 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063bceb1_c26d_453a_a74a_e6874c273034.slice/crio-0fc760964f01b1ab0646b7c98c22e05f9bcf6ddcfd4d4d73c4e875393c24fdaf WatchSource:0}: Error finding container 0fc760964f01b1ab0646b7c98c22e05f9bcf6ddcfd4d4d73c4e875393c24fdaf: Status 404 returned error can't find the container with id 0fc760964f01b1ab0646b7c98c22e05f9bcf6ddcfd4d4d73c4e875393c24fdaf Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.592510 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f5bc608_3853_4a58_ac8d_18f57baffe4c.slice/crio-57cb28d191cef5477cc8621a972c663548043b4eccb1e5b815a0b767408df4ac WatchSource:0}: Error finding container 57cb28d191cef5477cc8621a972c663548043b4eccb1e5b815a0b767408df4ac: Status 404 returned error can't find the container with id 57cb28d191cef5477cc8621a972c663548043b4eccb1e5b815a0b767408df4ac Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.592673 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pxq5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-rgt49_openstack-operators(063bceb1-c26d-453a-a74a-e6874c273034): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: W1007 19:15:40.592961 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cacb372_6381_4182_92eb_81e607f7cf31.slice/crio-84e9b7f5efbe1b6de81721b9527e7d64bbdfed9f37298369d71df92fba12023c WatchSource:0}: Error finding container 84e9b7f5efbe1b6de81721b9527e7d64bbdfed9f37298369d71df92fba12023c: Status 404 returned error can't find the container with id 84e9b7f5efbe1b6de81721b9527e7d64bbdfed9f37298369d71df92fba12023c Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.593971 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" podUID="063bceb1-c26d-453a-a74a-e6874c273034" Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.642411 
4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" podUID="0ecb1a32-2936-470c-a9c5-6701d461cd71" Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.646595 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tbntw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-jvj5c_openstack-operators(7f5bc608-3853-4a58-ac8d-18f57baffe4c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.646964 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DB
CLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resour
ces:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tndc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv_openstack-operators(8cacb372-6381-4182-92eb-81e607f7cf31): ErrImagePull: pull QPS exceeded" 
logger="UnhandledError" Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.657312 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" podUID="0caa8db7-d83d-47bd-9276-29102dd20de8" Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.677601 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" podUID="3b8778b6-81a2-4e3c-b464-6e5c8e063a4b" Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.701673 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" podUID="22f18be3-b165-4b14-90bd-3eac19ae3fee" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.778570 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" event={"ID":"01529960-5bd1-4a4d-8703-8d6a3ff38d4b","Type":"ContainerStarted","Data":"b40da66d55ad634c8f25296c64fcf8a6dd9ce9a93bc6199ad31ba233f1e6f7e2"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.781952 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" event={"ID":"9e61b6db-a40e-4ce3-8086-e51bbc6f6295","Type":"ContainerStarted","Data":"50e16c365d4a5c3da367712ed9abca60c14bdffbf0a00399f6b0eb5c2f6fed74"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.786096 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm"] Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.789586 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" event={"ID":"844cfe74-a770-4268-a60a-372586ac0744","Type":"ContainerStarted","Data":"36e8f797183d3591242c47bb3581ffabdb29980a4f2f4e437e97e9dcfcba4b82"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.814544 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" event={"ID":"3b8778b6-81a2-4e3c-b464-6e5c8e063a4b","Type":"ContainerStarted","Data":"2b91f2b5a0dd37def2d1868ef85130073a477957116cd3a79d3a9e078cfa1b65"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.814583 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" event={"ID":"3b8778b6-81a2-4e3c-b464-6e5c8e063a4b","Type":"ContainerStarted","Data":"6842e9a4e7d514c22361d86af670523bc0c43ac3dcebc691f6a210a75a631c8f"} Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.817097 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" podUID="3b8778b6-81a2-4e3c-b464-6e5c8e063a4b" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.820594 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" event={"ID":"063bceb1-c26d-453a-a74a-e6874c273034","Type":"ContainerStarted","Data":"0fc760964f01b1ab0646b7c98c22e05f9bcf6ddcfd4d4d73c4e875393c24fdaf"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.822930 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" event={"ID":"7f5bc608-3853-4a58-ac8d-18f57baffe4c","Type":"ContainerStarted","Data":"57cb28d191cef5477cc8621a972c663548043b4eccb1e5b815a0b767408df4ac"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.827366 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" event={"ID":"a0f7df98-caae-40a5-bb89-94123bce0763","Type":"ContainerStarted","Data":"7633989cd6fa4082a681a39c57aaed849d2d0b8c2961acd3b2360a0c0ac5fb4c"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.834884 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" event={"ID":"62bd3185-8c68-419d-b523-2de43d8dd015","Type":"ContainerStarted","Data":"fb63c00a9851490051ff1b6b9fb590d6bf5d555b833d7f4e1ac73f25d7ef834c"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.837538 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" event={"ID":"282406b3-2501-4b01-adf1-d952fc240404","Type":"ContainerStarted","Data":"05a1057237e5b639366a179ea9adc4c71ec57da0f09482462a7b55884a370096"} Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.838874 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" podUID="063bceb1-c26d-453a-a74a-e6874c273034" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.840509 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" 
event={"ID":"f99d1a15-090e-4a5e-a210-690be64c4742","Type":"ContainerStarted","Data":"bdd8823835b9a4aee70c4527a201c5a1c2d75b694d018cd2d3155ec799bb8478"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.847236 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" event={"ID":"0ecb1a32-2936-470c-a9c5-6701d461cd71","Type":"ContainerStarted","Data":"e7cd2872158d6e989216d34938c4f1edc452b142144eabed6c67516af5524799"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.847266 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" event={"ID":"0ecb1a32-2936-470c-a9c5-6701d461cd71","Type":"ContainerStarted","Data":"826e22b76931063620c99f69f9ebce8ccf3dc1729f1489d3aea46f2a849bf2b0"} Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.849482 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" podUID="0ecb1a32-2936-470c-a9c5-6701d461cd71" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.850646 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" event={"ID":"22f18be3-b165-4b14-90bd-3eac19ae3fee","Type":"ContainerStarted","Data":"0dea16bce8b30615d5c6e99df0def755ad8a89d3e75798165ae261c1d7bf18e2"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.850664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" 
event={"ID":"22f18be3-b165-4b14-90bd-3eac19ae3fee","Type":"ContainerStarted","Data":"ba1b25b86107aaba7d4779a1697da7e0aba491dc9b6eb37afcb7623683954ccf"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.852987 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" event={"ID":"8cacb372-6381-4182-92eb-81e607f7cf31","Type":"ContainerStarted","Data":"84e9b7f5efbe1b6de81721b9527e7d64bbdfed9f37298369d71df92fba12023c"} Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.854198 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" podUID="22f18be3-b165-4b14-90bd-3eac19ae3fee" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.856032 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" event={"ID":"c310d873-fc90-4658-ab06-ffa16a97c784","Type":"ContainerStarted","Data":"156494bbd4609c771167f95a6c1a219d88a4a16e2a38dd47c8b0d1fe06160b4f"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.870974 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" event={"ID":"1b11f862-ee30-4996-a8fb-218b3c27f07a","Type":"ContainerStarted","Data":"4ebe21fab6129b8e60aeccfe3e9ff9f6118448cfbdafa47149476eb5b014820a"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.875535 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" 
event={"ID":"bd6d051a-119d-45c5-9b81-939bba328c56","Type":"ContainerStarted","Data":"dfc46316229b2156077fd6c2b8c3626883498caf15874816a158e23b2df05bbe"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.879867 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" event={"ID":"c720cafe-11e6-4959-8228-b03cdb65242d","Type":"ContainerStarted","Data":"cd076d7a7b917d2a2157fa4b1c48c57e463e5bfbe83cc1c160312b753e144d66"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.886300 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" event={"ID":"528dd884-a7df-4574-920f-86ae0d779b62","Type":"ContainerStarted","Data":"0e9c3dac78a851965f5c342f7f78fe8b6113b149911657586c93cda021f21371"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.889321 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" event={"ID":"0caa8db7-d83d-47bd-9276-29102dd20de8","Type":"ContainerStarted","Data":"d1ea239f4baf9bbe3e5a9c44d951d3e322c36bf54a0fa560c3c1c179d141573f"} Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.889352 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" event={"ID":"0caa8db7-d83d-47bd-9276-29102dd20de8","Type":"ContainerStarted","Data":"1ad0dfd11c7a1abe0f51db3dce48cb3318ad7e90379c05c81d6a66fbd296eabb"} Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.890481 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" 
podUID="0caa8db7-d83d-47bd-9276-29102dd20de8" Oct 07 19:15:40 crc kubenswrapper[4825]: I1007 19:15:40.895975 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" event={"ID":"3f63b792-0ed9-453e-8dff-afac52bac339","Type":"ContainerStarted","Data":"338d3ef5d7a5cee91cb7e78b98a3033172e3fd0253cac61e251ae5685e89759e"} Oct 07 19:15:40 crc kubenswrapper[4825]: E1007 19:15:40.953147 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" podUID="f99d1a15-090e-4a5e-a210-690be64c4742" Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.042663 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" podUID="8cacb372-6381-4182-92eb-81e607f7cf31" Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.060810 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" podUID="7f5bc608-3853-4a58-ac8d-18f57baffe4c" Oct 07 19:15:41 crc kubenswrapper[4825]: I1007 19:15:41.917274 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" event={"ID":"8cacb372-6381-4182-92eb-81e607f7cf31","Type":"ContainerStarted","Data":"3ddb77c163f4ecfdbcdf72b212afddaf5ac104669d37dadc45b194d0e993b7b3"} Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.919750 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" podUID="8cacb372-6381-4182-92eb-81e607f7cf31" Oct 07 19:15:41 crc kubenswrapper[4825]: I1007 19:15:41.924490 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" event={"ID":"f99d1a15-090e-4a5e-a210-690be64c4742","Type":"ContainerStarted","Data":"38ede5fd4e40c3a2721557a26a0944c9e718fed3ecef8d584aecc3e0abcfe72a"} Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.926217 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" podUID="f99d1a15-090e-4a5e-a210-690be64c4742" Oct 07 19:15:41 crc kubenswrapper[4825]: I1007 19:15:41.932288 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" event={"ID":"3463b9a9-3935-4a41-b710-77084296fa18","Type":"ContainerStarted","Data":"631f3a16a4c5a625f0371809d4d09af03bf5dca64870af6ed1b34fbe51ccfad9"} Oct 07 19:15:41 crc kubenswrapper[4825]: I1007 19:15:41.932331 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" event={"ID":"3463b9a9-3935-4a41-b710-77084296fa18","Type":"ContainerStarted","Data":"1030fc89b06e5369ba4f109c1e32810f28cd81a2e08684efd6a5ccf0cb1bc36d"} Oct 07 19:15:41 crc kubenswrapper[4825]: I1007 19:15:41.932342 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" event={"ID":"3463b9a9-3935-4a41-b710-77084296fa18","Type":"ContainerStarted","Data":"8952132554a29cd1d3b1399618debbda0b9d58589c5d4a9b0a6b844add51f067"} Oct 07 19:15:41 crc kubenswrapper[4825]: I1007 19:15:41.933757 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 19:15:41 crc kubenswrapper[4825]: I1007 19:15:41.946213 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" event={"ID":"7f5bc608-3853-4a58-ac8d-18f57baffe4c","Type":"ContainerStarted","Data":"fbedc230a2fb13cb92a64b7d591f84a19e62232769135a5eefaa709af2a0dd24"} Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.955606 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" podUID="3b8778b6-81a2-4e3c-b464-6e5c8e063a4b" Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.955852 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" podUID="0caa8db7-d83d-47bd-9276-29102dd20de8" Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.955900 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" podUID="7f5bc608-3853-4a58-ac8d-18f57baffe4c" Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.955972 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" podUID="0ecb1a32-2936-470c-a9c5-6701d461cd71" Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.956012 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" podUID="063bceb1-c26d-453a-a74a-e6874c273034" Oct 07 19:15:41 crc kubenswrapper[4825]: E1007 19:15:41.956579 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" podUID="22f18be3-b165-4b14-90bd-3eac19ae3fee" Oct 07 19:15:42 crc kubenswrapper[4825]: I1007 19:15:42.221224 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" podStartSLOduration=3.221204959 podStartE2EDuration="3.221204959s" 
podCreationTimestamp="2025-10-07 19:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:15:42.213728959 +0000 UTC m=+931.035767606" watchObservedRunningTime="2025-10-07 19:15:42.221204959 +0000 UTC m=+931.043243596" Oct 07 19:15:42 crc kubenswrapper[4825]: E1007 19:15:42.954865 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" podUID="8cacb372-6381-4182-92eb-81e607f7cf31" Oct 07 19:15:42 crc kubenswrapper[4825]: E1007 19:15:42.955144 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" podUID="f99d1a15-090e-4a5e-a210-690be64c4742" Oct 07 19:15:42 crc kubenswrapper[4825]: E1007 19:15:42.955677 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" podUID="7f5bc608-3853-4a58-ac8d-18f57baffe4c" Oct 07 19:15:50 crc kubenswrapper[4825]: I1007 19:15:50.193685 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-rxjhm" Oct 07 
19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.037724 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" event={"ID":"528dd884-a7df-4574-920f-86ae0d779b62","Type":"ContainerStarted","Data":"13a0ff25a52186c146bfaff1c59b88302c79f824643b605c5462cefa14335e2d"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.073581 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" event={"ID":"bd6d051a-119d-45c5-9b81-939bba328c56","Type":"ContainerStarted","Data":"35914e2d0764226b72bac674c390121fc92a9b83dc269d968497620ed16ae30b"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.084583 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" event={"ID":"282406b3-2501-4b01-adf1-d952fc240404","Type":"ContainerStarted","Data":"413e2746197cfa07d0c1dc19f9bcd431c785f9ab11931d7888c2e5e35efdc246"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.106399 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" event={"ID":"c310d873-fc90-4658-ab06-ffa16a97c784","Type":"ContainerStarted","Data":"d1392413b30d5093fbeddd6b9a959e7601d9b210666b98eee1107eafaf84defd"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.115475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" event={"ID":"97dc66cd-4313-4951-b85c-dedd5cd2e6ba","Type":"ContainerStarted","Data":"ab6257bc542831407b415352d28f4050ad052df614f095e2854fa4dc387751fd"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.130577 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" 
event={"ID":"01529960-5bd1-4a4d-8703-8d6a3ff38d4b","Type":"ContainerStarted","Data":"f1177ef367c73513723f8f22fb94970646df8ac928559431a5c63670721d1a9c"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.131945 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" event={"ID":"844cfe74-a770-4268-a60a-372586ac0744","Type":"ContainerStarted","Data":"9352c9d0b13b034380e2df28c2ce5c32a59bad01df44ff585a12d0558f285322"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.139497 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" event={"ID":"62bd3185-8c68-419d-b523-2de43d8dd015","Type":"ContainerStarted","Data":"4925835803fd9b3c649d88e0fde55d42282691375911588e83acd26494f346b2"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.139558 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" event={"ID":"62bd3185-8c68-419d-b523-2de43d8dd015","Type":"ContainerStarted","Data":"3ea4efa8fca786355774570b17ad6cd98b97aeae5866237d28939286354ff32d"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.140529 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.160688 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" event={"ID":"9e61b6db-a40e-4ce3-8086-e51bbc6f6295","Type":"ContainerStarted","Data":"2ccedab890284d567f65d0bf98eaed42bf994968f46c3c8c16b097812aeebc0a"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.192203 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" podStartSLOduration=3.397050251 
podStartE2EDuration="15.192180316s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.080672306 +0000 UTC m=+928.902710933" lastFinishedPulling="2025-10-07 19:15:51.875802321 +0000 UTC m=+940.697840998" observedRunningTime="2025-10-07 19:15:53.187785195 +0000 UTC m=+942.009823832" watchObservedRunningTime="2025-10-07 19:15:53.192180316 +0000 UTC m=+942.014218953" Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.192444 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" event={"ID":"a0f7df98-caae-40a5-bb89-94123bce0763","Type":"ContainerStarted","Data":"36cec46ab1f98269c74d3402d2912c5eae4f227cd02f74bde112847b5f841e7e"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.200814 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" event={"ID":"c720cafe-11e6-4959-8228-b03cdb65242d","Type":"ContainerStarted","Data":"f80333d53ff891e4852b9022e482c81baf8e081166b90e7da9ffbe9ce2c43311"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.204135 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" event={"ID":"ef71a3c8-e986-4f19-a234-9e9ef7749132","Type":"ContainerStarted","Data":"1f54492dbc8f5cf9153d72263836d92308a34602c74bc2eb74647f99c1f17273"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.205249 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" event={"ID":"3f63b792-0ed9-453e-8dff-afac52bac339","Type":"ContainerStarted","Data":"a0fce6d978c27248008f10632728b879e0e1fc33f2cc669388c1c55d3b0ab0dd"} Oct 07 19:15:53 crc kubenswrapper[4825]: I1007 19:15:53.221851 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" 
event={"ID":"1b11f862-ee30-4996-a8fb-218b3c27f07a","Type":"ContainerStarted","Data":"6e2d51a2e733791209a27f1ccb977da31295b975b2adcec169781e1382aa20c8"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.231346 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" event={"ID":"282406b3-2501-4b01-adf1-d952fc240404","Type":"ContainerStarted","Data":"28f1724bb00724d72a1ae9c6aba0c33e4ee23f77527b4e4a9ad7522ed5005183"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.231731 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.235151 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" event={"ID":"c310d873-fc90-4658-ab06-ffa16a97c784","Type":"ContainerStarted","Data":"2bff403edc5ff582e40f55dc6cd7263b250a5b8a9a6bfc7d1db14202eeefc631"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.235322 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.237634 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" event={"ID":"97dc66cd-4313-4951-b85c-dedd5cd2e6ba","Type":"ContainerStarted","Data":"909ba45e9e996158e010467ec8d810dfc8d12c9620ed3869a3c171fcfd015096"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.237765 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.240261 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" event={"ID":"01529960-5bd1-4a4d-8703-8d6a3ff38d4b","Type":"ContainerStarted","Data":"24b722dae14ac21191a4ffc5fa12adc5692c5c6d9b7f1bd7476472542227a927"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.240409 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.242133 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" event={"ID":"9e61b6db-a40e-4ce3-8086-e51bbc6f6295","Type":"ContainerStarted","Data":"b94f3fd3f88199c80d9575774f90c2d0c3d38642bb6fbef94cb48b1a1640e8db"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.242283 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.248613 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" event={"ID":"bd6d051a-119d-45c5-9b81-939bba328c56","Type":"ContainerStarted","Data":"77face68545f31e34f31ef8a79e41327d0c71a2250cc34c916787fb3bc5e4056"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.248689 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.253752 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" event={"ID":"a0f7df98-caae-40a5-bb89-94123bce0763","Type":"ContainerStarted","Data":"621b0b3dcc7078c9ab8d55711d38b37f2b50c00f12026d1309fbca693b5587f1"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.253903 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.255550 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" event={"ID":"528dd884-a7df-4574-920f-86ae0d779b62","Type":"ContainerStarted","Data":"70f384bb735d225b3c965bc843f6be3512e6a9e5ed7d3b5bd1a8e4fcff4ddde9"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.256091 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.257527 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" podStartSLOduration=4.499597266 podStartE2EDuration="16.257499789s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.173126654 +0000 UTC m=+928.995165291" lastFinishedPulling="2025-10-07 19:15:51.931029167 +0000 UTC m=+940.753067814" observedRunningTime="2025-10-07 19:15:54.249479012 +0000 UTC m=+943.071517659" watchObservedRunningTime="2025-10-07 19:15:54.257499789 +0000 UTC m=+943.079538426" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.259413 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" event={"ID":"c720cafe-11e6-4959-8228-b03cdb65242d","Type":"ContainerStarted","Data":"c55587dfa5408e91d9b0db3a19607d02bcde1c5568724c0258e462d7e1d0086e"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.259576 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.265159 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" event={"ID":"3f63b792-0ed9-453e-8dff-afac52bac339","Type":"ContainerStarted","Data":"bab351f844beffd812b3280def2152b9afa1899734f1baf239076535d24fd862"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.265218 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.280915 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" podStartSLOduration=4.766136842 podStartE2EDuration="16.280888127s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.39867122 +0000 UTC m=+929.220709847" lastFinishedPulling="2025-10-07 19:15:51.913422495 +0000 UTC m=+940.735461132" observedRunningTime="2025-10-07 19:15:54.277690025 +0000 UTC m=+943.099728662" watchObservedRunningTime="2025-10-07 19:15:54.280888127 +0000 UTC m=+943.102926764" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.282051 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" event={"ID":"1b11f862-ee30-4996-a8fb-218b3c27f07a","Type":"ContainerStarted","Data":"18dc07c28dff08fef4d8a0008095624cd7e3ed8dc001392b325767ff6547fd5c"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.282540 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.284787 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" 
event={"ID":"844cfe74-a770-4268-a60a-372586ac0744","Type":"ContainerStarted","Data":"fccfb3bd8a654bbc6d309cc401285f9c19f29b35f135bd14e1a4af88f35fe0a1"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.285275 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.292240 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" event={"ID":"ef71a3c8-e986-4f19-a234-9e9ef7749132","Type":"ContainerStarted","Data":"2b9d2a770cb00de51c886dd843c64a771e7f010e7f99c66aee2af0693f3ffd8a"} Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.292295 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.311078 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" podStartSLOduration=4.528398516 podStartE2EDuration="16.311056332s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.093264729 +0000 UTC m=+928.915303376" lastFinishedPulling="2025-10-07 19:15:51.875922555 +0000 UTC m=+940.697961192" observedRunningTime="2025-10-07 19:15:54.307500328 +0000 UTC m=+943.129539055" watchObservedRunningTime="2025-10-07 19:15:54.311056332 +0000 UTC m=+943.133094969" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.331442 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" podStartSLOduration=4.059175316 podStartE2EDuration="16.331417854s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:39.628433638 +0000 UTC m=+928.450472275" 
lastFinishedPulling="2025-10-07 19:15:51.900676146 +0000 UTC m=+940.722714813" observedRunningTime="2025-10-07 19:15:54.330101271 +0000 UTC m=+943.152139928" watchObservedRunningTime="2025-10-07 19:15:54.331417854 +0000 UTC m=+943.153456491" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.354466 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" podStartSLOduration=4.705500724 podStartE2EDuration="16.354439101s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.226938506 +0000 UTC m=+929.048977143" lastFinishedPulling="2025-10-07 19:15:51.875876883 +0000 UTC m=+940.697915520" observedRunningTime="2025-10-07 19:15:54.347139547 +0000 UTC m=+943.169178194" watchObservedRunningTime="2025-10-07 19:15:54.354439101 +0000 UTC m=+943.176477738" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.371774 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" podStartSLOduration=5.028038202 podStartE2EDuration="16.371754764s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.563297487 +0000 UTC m=+929.385336124" lastFinishedPulling="2025-10-07 19:15:51.907014049 +0000 UTC m=+940.729052686" observedRunningTime="2025-10-07 19:15:54.370699651 +0000 UTC m=+943.192738308" watchObservedRunningTime="2025-10-07 19:15:54.371754764 +0000 UTC m=+943.193793401" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.391737 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" podStartSLOduration=4.622512067 podStartE2EDuration="16.391717093s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.171939246 +0000 UTC m=+928.993977883" lastFinishedPulling="2025-10-07 
19:15:51.941144272 +0000 UTC m=+940.763182909" observedRunningTime="2025-10-07 19:15:54.391008231 +0000 UTC m=+943.213046878" watchObservedRunningTime="2025-10-07 19:15:54.391717093 +0000 UTC m=+943.213755730" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.422909 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" podStartSLOduration=4.940678127 podStartE2EDuration="16.42286997s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.393871876 +0000 UTC m=+929.215910523" lastFinishedPulling="2025-10-07 19:15:51.876063719 +0000 UTC m=+940.698102366" observedRunningTime="2025-10-07 19:15:54.414666437 +0000 UTC m=+943.236705074" watchObservedRunningTime="2025-10-07 19:15:54.42286997 +0000 UTC m=+943.244908617" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.437845 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" podStartSLOduration=4.698574301 podStartE2EDuration="16.437825048s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.173466555 +0000 UTC m=+928.995505182" lastFinishedPulling="2025-10-07 19:15:51.912717292 +0000 UTC m=+940.734755929" observedRunningTime="2025-10-07 19:15:54.434535513 +0000 UTC m=+943.256574170" watchObservedRunningTime="2025-10-07 19:15:54.437825048 +0000 UTC m=+943.259863685" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.460831 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" podStartSLOduration=4.769653767 podStartE2EDuration="16.460798184s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.17767389 +0000 UTC m=+928.999712527" lastFinishedPulling="2025-10-07 19:15:51.868818307 +0000 UTC 
m=+940.690856944" observedRunningTime="2025-10-07 19:15:54.457355333 +0000 UTC m=+943.279393970" watchObservedRunningTime="2025-10-07 19:15:54.460798184 +0000 UTC m=+943.282836821" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.481868 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" podStartSLOduration=4.770461801 podStartE2EDuration="16.481849807s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.201965437 +0000 UTC m=+929.024004074" lastFinishedPulling="2025-10-07 19:15:51.913353443 +0000 UTC m=+940.735392080" observedRunningTime="2025-10-07 19:15:54.478199349 +0000 UTC m=+943.300237986" watchObservedRunningTime="2025-10-07 19:15:54.481849807 +0000 UTC m=+943.303888444" Oct 07 19:15:54 crc kubenswrapper[4825]: I1007 19:15:54.507373 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" podStartSLOduration=5.005304364 podStartE2EDuration="16.507354063s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.398979449 +0000 UTC m=+929.221018086" lastFinishedPulling="2025-10-07 19:15:51.901029148 +0000 UTC m=+940.723067785" observedRunningTime="2025-10-07 19:15:54.497319302 +0000 UTC m=+943.319357939" watchObservedRunningTime="2025-10-07 19:15:54.507354063 +0000 UTC m=+943.329392690" Oct 07 19:15:58 crc kubenswrapper[4825]: I1007 19:15:58.821031 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-mrnwx" Oct 07 19:15:58 crc kubenswrapper[4825]: I1007 19:15:58.827645 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" Oct 07 19:15:58 crc kubenswrapper[4825]: I1007 19:15:58.841497 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v4dk9" podStartSLOduration=8.570162575 podStartE2EDuration="20.841474705s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:39.584605595 +0000 UTC m=+928.406644232" lastFinishedPulling="2025-10-07 19:15:51.855917725 +0000 UTC m=+940.677956362" observedRunningTime="2025-10-07 19:15:54.52072789 +0000 UTC m=+943.342766527" watchObservedRunningTime="2025-10-07 19:15:58.841474705 +0000 UTC m=+947.663513342" Oct 07 19:15:58 crc kubenswrapper[4825]: I1007 19:15:58.868937 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-ln2cd" Oct 07 19:15:58 crc kubenswrapper[4825]: I1007 19:15:58.881527 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-n28d6" Oct 07 19:15:58 crc kubenswrapper[4825]: I1007 19:15:58.910559 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jrh49" Oct 07 19:15:58 crc kubenswrapper[4825]: I1007 19:15:58.944774 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-b9tp8" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.008882 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-f9pdx" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.050421 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-649675d675-m87x4" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.125023 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-5mng4" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.146715 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9nndm" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.218401 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-fpkzs" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.311504 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-9llfs" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.464726 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bs76l" Oct 07 19:15:59 crc kubenswrapper[4825]: I1007 19:15:59.527860 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lg8z7" Oct 07 19:16:00 crc kubenswrapper[4825]: I1007 19:16:00.370038 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" event={"ID":"22f18be3-b165-4b14-90bd-3eac19ae3fee","Type":"ContainerStarted","Data":"e3a0d78f7809705ec70939050945c52f6cd982f46942f4d752ba8059023d96f7"} Oct 07 19:16:00 crc kubenswrapper[4825]: I1007 19:16:00.371000 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" Oct 07 19:16:00 crc kubenswrapper[4825]: I1007 19:16:00.373804 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" 
event={"ID":"3b8778b6-81a2-4e3c-b464-6e5c8e063a4b","Type":"ContainerStarted","Data":"df002c9f715ee31d40647ba189008575476a2ef08269fda0efa7f9375c60e1cc"} Oct 07 19:16:00 crc kubenswrapper[4825]: I1007 19:16:00.374722 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" Oct 07 19:16:00 crc kubenswrapper[4825]: I1007 19:16:00.390776 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" podStartSLOduration=5.633691138 podStartE2EDuration="22.390755841s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.415081175 +0000 UTC m=+929.237119812" lastFinishedPulling="2025-10-07 19:15:57.172145878 +0000 UTC m=+945.994184515" observedRunningTime="2025-10-07 19:16:00.390289627 +0000 UTC m=+949.212328264" watchObservedRunningTime="2025-10-07 19:16:00.390755841 +0000 UTC m=+949.212794478" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.387308 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" event={"ID":"8cacb372-6381-4182-92eb-81e607f7cf31","Type":"ContainerStarted","Data":"a66523011bf8a651e080e487390e663fd45258fc31df934fd413f98419dc4e42"} Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.387629 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.391324 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" event={"ID":"f99d1a15-090e-4a5e-a210-690be64c4742","Type":"ContainerStarted","Data":"1cc9eb4182983b8943a966d7c10d5432fd16223c3cf1819e61e7e01848e93d1d"} Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 
19:16:01.391977 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.394680 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" event={"ID":"0caa8db7-d83d-47bd-9276-29102dd20de8","Type":"ContainerStarted","Data":"900fe4b33eaf83c36273dffb02588a0bcc51c01383580859f529f6a941fb8a44"} Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.395169 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.397616 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" event={"ID":"0ecb1a32-2936-470c-a9c5-6701d461cd71","Type":"ContainerStarted","Data":"4f17a2e4c229844b2ef4d9190a2588545889a97838a39796e1f6f6d6aa5715e1"} Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.397991 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.399556 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" event={"ID":"063bceb1-c26d-453a-a74a-e6874c273034","Type":"ContainerStarted","Data":"5d159835717c53efb6b0d9680bc2b5ba36ea85ac022d52976be3d29a7fa1af0d"} Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.402489 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" event={"ID":"7f5bc608-3853-4a58-ac8d-18f57baffe4c","Type":"ContainerStarted","Data":"b8ddaaae966898f598403d49986a4d2850650c4fe1ab0a0a21a931886a45f19a"} Oct 07 19:16:01 crc 
kubenswrapper[4825]: I1007 19:16:01.402880 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.418314 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" podStartSLOduration=6.133785908 podStartE2EDuration="23.418284745s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.413863206 +0000 UTC m=+929.235901843" lastFinishedPulling="2025-10-07 19:15:57.698362053 +0000 UTC m=+946.520400680" observedRunningTime="2025-10-07 19:16:00.409484201 +0000 UTC m=+949.231522848" watchObservedRunningTime="2025-10-07 19:16:01.418284745 +0000 UTC m=+950.240323382" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.419883 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" podStartSLOduration=3.7424110600000002 podStartE2EDuration="23.419878836s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.646656654 +0000 UTC m=+929.468695291" lastFinishedPulling="2025-10-07 19:16:00.32412443 +0000 UTC m=+949.146163067" observedRunningTime="2025-10-07 19:16:01.410895949 +0000 UTC m=+950.232934576" watchObservedRunningTime="2025-10-07 19:16:01.419878836 +0000 UTC m=+950.241917473" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.444286 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" podStartSLOduration=5.8996173259999996 podStartE2EDuration="23.444257466s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.646447337 +0000 UTC m=+929.468485974" lastFinishedPulling="2025-10-07 
19:15:58.191087477 +0000 UTC m=+947.013126114" observedRunningTime="2025-10-07 19:16:01.439793814 +0000 UTC m=+950.261832451" watchObservedRunningTime="2025-10-07 19:16:01.444257466 +0000 UTC m=+950.266296133" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.466288 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" podStartSLOduration=4.216043132 podStartE2EDuration="23.46626072s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.406437958 +0000 UTC m=+929.228476595" lastFinishedPulling="2025-10-07 19:15:59.656655546 +0000 UTC m=+948.478694183" observedRunningTime="2025-10-07 19:16:01.462784379 +0000 UTC m=+950.284823016" watchObservedRunningTime="2025-10-07 19:16:01.46626072 +0000 UTC m=+950.288299377" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.485994 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" podStartSLOduration=3.48769319 podStartE2EDuration="23.48596583s" podCreationTimestamp="2025-10-07 19:15:38 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.416211391 +0000 UTC m=+929.238250028" lastFinishedPulling="2025-10-07 19:16:00.414484031 +0000 UTC m=+949.236522668" observedRunningTime="2025-10-07 19:16:01.481125706 +0000 UTC m=+950.303164353" watchObservedRunningTime="2025-10-07 19:16:01.48596583 +0000 UTC m=+950.308004477" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.503949 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rgt49" podStartSLOduration=3.478097947 podStartE2EDuration="22.503923895s" podCreationTimestamp="2025-10-07 19:15:39 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.592564823 +0000 UTC m=+929.414603460" lastFinishedPulling="2025-10-07 19:15:59.618390781 +0000 UTC 
m=+948.440429408" observedRunningTime="2025-10-07 19:16:01.498001556 +0000 UTC m=+950.320040203" watchObservedRunningTime="2025-10-07 19:16:01.503923895 +0000 UTC m=+950.325962542" Oct 07 19:16:01 crc kubenswrapper[4825]: I1007 19:16:01.524651 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" podStartSLOduration=3.448543741 podStartE2EDuration="22.524627437s" podCreationTimestamp="2025-10-07 19:15:39 +0000 UTC" firstStartedPulling="2025-10-07 19:15:40.58057376 +0000 UTC m=+929.402612397" lastFinishedPulling="2025-10-07 19:15:59.656657456 +0000 UTC m=+948.478696093" observedRunningTime="2025-10-07 19:16:01.521109435 +0000 UTC m=+950.343148082" watchObservedRunningTime="2025-10-07 19:16:01.524627437 +0000 UTC m=+950.346666084" Oct 07 19:16:05 crc kubenswrapper[4825]: I1007 19:16:05.708763 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:16:05 crc kubenswrapper[4825]: I1007 19:16:05.709314 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:16:09 crc kubenswrapper[4825]: I1007 19:16:09.199866 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm" Oct 07 19:16:09 crc kubenswrapper[4825]: I1007 19:16:09.232372 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vn86z" Oct 07 19:16:09 crc kubenswrapper[4825]: I1007 19:16:09.259285 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-hjgxc" Oct 07 19:16:09 crc kubenswrapper[4825]: I1007 19:16:09.330975 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lp8qp" Oct 07 19:16:09 crc kubenswrapper[4825]: I1007 19:16:09.487052 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-jvj5c" Oct 07 19:16:09 crc kubenswrapper[4825]: I1007 19:16:09.552477 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-zbnmp" Oct 07 19:16:09 crc kubenswrapper[4825]: I1007 19:16:09.889380 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.394419 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2ldhk"] Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.396422 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.404561 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.404612 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.404687 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.404898 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zlzd5" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.415148 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2ldhk"] Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.441373 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5d2\" (UniqueName: \"kubernetes.io/projected/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-kube-api-access-zs5d2\") pod \"dnsmasq-dns-675f4bcbfc-2ldhk\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.441466 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-config\") pod \"dnsmasq-dns-675f4bcbfc-2ldhk\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.472050 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbv9c"] Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.473406 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.476611 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.482288 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbv9c"] Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.542544 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9q7v\" (UniqueName: \"kubernetes.io/projected/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-kube-api-access-n9q7v\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.542612 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-config\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.542642 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5d2\" (UniqueName: \"kubernetes.io/projected/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-kube-api-access-zs5d2\") pod \"dnsmasq-dns-675f4bcbfc-2ldhk\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.542799 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-config\") pod \"dnsmasq-dns-675f4bcbfc-2ldhk\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc 
kubenswrapper[4825]: I1007 19:16:32.542838 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.544445 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-config\") pod \"dnsmasq-dns-675f4bcbfc-2ldhk\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.575106 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5d2\" (UniqueName: \"kubernetes.io/projected/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-kube-api-access-zs5d2\") pod \"dnsmasq-dns-675f4bcbfc-2ldhk\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.644599 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.644664 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9q7v\" (UniqueName: \"kubernetes.io/projected/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-kube-api-access-n9q7v\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.644710 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-config\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.645519 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.645546 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-config\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.660847 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9q7v\" (UniqueName: \"kubernetes.io/projected/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-kube-api-access-n9q7v\") pod \"dnsmasq-dns-78dd6ddcc-wbv9c\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.724860 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:32 crc kubenswrapper[4825]: I1007 19:16:32.793164 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:33 crc kubenswrapper[4825]: I1007 19:16:33.241387 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2ldhk"] Oct 07 19:16:33 crc kubenswrapper[4825]: I1007 19:16:33.250751 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:16:33 crc kubenswrapper[4825]: I1007 19:16:33.301715 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbv9c"] Oct 07 19:16:33 crc kubenswrapper[4825]: W1007 19:16:33.306516 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e06dd3b_bb9d_49a5_91ff_fc8a28ea8376.slice/crio-2e344504afc5a2163a5951b1944d3cff77796648b44e43f75a6836f3eb4046c6 WatchSource:0}: Error finding container 2e344504afc5a2163a5951b1944d3cff77796648b44e43f75a6836f3eb4046c6: Status 404 returned error can't find the container with id 2e344504afc5a2163a5951b1944d3cff77796648b44e43f75a6836f3eb4046c6 Oct 07 19:16:33 crc kubenswrapper[4825]: I1007 19:16:33.771419 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" event={"ID":"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376","Type":"ContainerStarted","Data":"2e344504afc5a2163a5951b1944d3cff77796648b44e43f75a6836f3eb4046c6"} Oct 07 19:16:33 crc kubenswrapper[4825]: I1007 19:16:33.785367 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" event={"ID":"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f","Type":"ContainerStarted","Data":"e1fcfa42b3d1bef822c232ee8b10efcf63734b4499853fffdc952604802196e1"} Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.421449 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2ldhk"] Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.439343 4825 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lc2dg"] Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.440640 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.449072 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lc2dg"] Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.504184 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wskfm\" (UniqueName: \"kubernetes.io/projected/57eca3f0-2ba2-4cee-9317-b2157f602944-kube-api-access-wskfm\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.504308 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-config\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.504338 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.606714 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wskfm\" (UniqueName: \"kubernetes.io/projected/57eca3f0-2ba2-4cee-9317-b2157f602944-kube-api-access-wskfm\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " 
pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.606868 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-config\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.606895 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.607811 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.608679 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-config\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.631865 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wskfm\" (UniqueName: \"kubernetes.io/projected/57eca3f0-2ba2-4cee-9317-b2157f602944-kube-api-access-wskfm\") pod \"dnsmasq-dns-666b6646f7-lc2dg\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.708757 4825 
patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.708808 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.745896 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbv9c"] Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.784447 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-98zzb"] Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.795641 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-98zzb"] Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.796246 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.823701 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.920034 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.920097 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-config\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:35 crc kubenswrapper[4825]: I1007 19:16:35.920162 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chrzv\" (UniqueName: \"kubernetes.io/projected/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-kube-api-access-chrzv\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.021987 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.022335 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-config\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.022380 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chrzv\" (UniqueName: \"kubernetes.io/projected/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-kube-api-access-chrzv\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.023040 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.025945 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-config\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.038823 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chrzv\" (UniqueName: \"kubernetes.io/projected/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-kube-api-access-chrzv\") pod \"dnsmasq-dns-57d769cc4f-98zzb\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.127419 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.592805 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.595480 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.600095 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.600358 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.600385 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.600530 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.600588 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.600675 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.600729 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jffjz" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.607135 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739347 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-confd\") 
pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739389 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19bd5f67-ab1b-4816-8e44-f792ea626299-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739409 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-config-data\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739578 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739607 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19bd5f67-ab1b-4816-8e44-f792ea626299-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " 
pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739636 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739704 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739890 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.739969 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s4gz\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-kube-api-access-7s4gz\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.740048 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc 
kubenswrapper[4825]: I1007 19:16:36.841345 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841401 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s4gz\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-kube-api-access-7s4gz\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841446 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841477 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841494 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19bd5f67-ab1b-4816-8e44-f792ea626299-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841508 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-config-data\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841570 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841586 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19bd5f67-ab1b-4816-8e44-f792ea626299-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841607 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.841636 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " 
pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.842135 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.842214 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.842352 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.842916 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.843257 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-config-data\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.843716 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.845951 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.847094 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19bd5f67-ab1b-4816-8e44-f792ea626299-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.847790 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.860650 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19bd5f67-ab1b-4816-8e44-f792ea626299-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.862800 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s4gz\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-kube-api-access-7s4gz\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " 
pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.868749 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.868921 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.870042 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.871904 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.873144 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.873475 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.873635 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.873804 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.875145 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.875310 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vvsrv" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.884500 4825 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.922084 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942624 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942704 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942742 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942761 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f43a8cb5-b546-476e-a429-12947216e9b0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942795 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942816 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942845 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942866 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942904 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plwj5\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-kube-api-access-plwj5\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942919 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f43a8cb5-b546-476e-a429-12947216e9b0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:36 crc kubenswrapper[4825]: I1007 19:16:36.942936 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.044915 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plwj5\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-kube-api-access-plwj5\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.044977 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f43a8cb5-b546-476e-a429-12947216e9b0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045001 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045046 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045085 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045131 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f43a8cb5-b546-476e-a429-12947216e9b0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045213 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045262 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045285 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045675 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.045735 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.046093 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 
19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.046932 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.047117 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.047771 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.051243 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f43a8cb5-b546-476e-a429-12947216e9b0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.052035 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.052803 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/f43a8cb5-b546-476e-a429-12947216e9b0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.059102 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.062886 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plwj5\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-kube-api-access-plwj5\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.077219 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:37 crc kubenswrapper[4825]: I1007 19:16:37.228224 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.858564 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.861027 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.873136 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.873966 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nnxcj" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.874510 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.876595 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.884908 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.890114 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.902638 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985068 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985121 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-secrets\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 
19:16:38.985145 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz74x\" (UniqueName: \"kubernetes.io/projected/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-kube-api-access-hz74x\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985166 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-config-data-default\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985193 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985210 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985236 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985261 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-kolla-config\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:38 crc kubenswrapper[4825]: I1007 19:16:38.985276 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.086882 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.086934 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-secrets\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.086955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz74x\" (UniqueName: \"kubernetes.io/projected/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-kube-api-access-hz74x\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.086977 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-config-data-default\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.087001 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.087019 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.087034 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.087059 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-kolla-config\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.087073 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.087492 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.087679 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.089513 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.090726 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-config-data-default\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.092270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-kolla-config\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.092507 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-secrets\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.093706 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.096303 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.106471 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz74x\" (UniqueName: \"kubernetes.io/projected/fa0ba0a4-872f-4ebd-8ee1-0e57174648a9-kube-api-access-hz74x\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.108742 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9\") " pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.201682 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.710762 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.712337 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.717412 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.717679 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.719217 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.722318 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.727542 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bpzrl" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.800770 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.800928 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5l7h\" (UniqueName: \"kubernetes.io/projected/6451a8c0-c6b1-4098-846d-24fe8c26d849-kube-api-access-c5l7h\") pod \"openstack-cell1-galera-0\" (UID: 
\"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.801003 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.801051 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.801093 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.801127 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6451a8c0-c6b1-4098-846d-24fe8c26d849-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.801150 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.801209 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.801243 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.903910 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.903975 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.904088 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " 
pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.904217 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5l7h\" (UniqueName: \"kubernetes.io/projected/6451a8c0-c6b1-4098-846d-24fe8c26d849-kube-api-access-c5l7h\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.904281 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.904335 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.904383 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.904419 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6451a8c0-c6b1-4098-846d-24fe8c26d849-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 
19:16:39.904453 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.905203 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.905419 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.905467 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.905752 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6451a8c0-c6b1-4098-846d-24fe8c26d849-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.906892 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451a8c0-c6b1-4098-846d-24fe8c26d849-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.911641 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.913780 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.916927 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6451a8c0-c6b1-4098-846d-24fe8c26d849-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.935127 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5l7h\" (UniqueName: \"kubernetes.io/projected/6451a8c0-c6b1-4098-846d-24fe8c26d849-kube-api-access-c5l7h\") pod \"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:39 crc kubenswrapper[4825]: I1007 19:16:39.939924 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"6451a8c0-c6b1-4098-846d-24fe8c26d849\") " pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.035469 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.248868 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.250132 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.252438 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.252729 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qvs74" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.256254 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.258409 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.310514 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8448c74b-bea3-42c0-95da-ab251a90ca9f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.310567 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9xlz\" (UniqueName: \"kubernetes.io/projected/8448c74b-bea3-42c0-95da-ab251a90ca9f-kube-api-access-c9xlz\") pod \"memcached-0\" (UID: 
\"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.310700 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8448c74b-bea3-42c0-95da-ab251a90ca9f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.310746 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8448c74b-bea3-42c0-95da-ab251a90ca9f-kolla-config\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.311595 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8448c74b-bea3-42c0-95da-ab251a90ca9f-config-data\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.413015 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8448c74b-bea3-42c0-95da-ab251a90ca9f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.413061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xlz\" (UniqueName: \"kubernetes.io/projected/8448c74b-bea3-42c0-95da-ab251a90ca9f-kube-api-access-c9xlz\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.413105 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8448c74b-bea3-42c0-95da-ab251a90ca9f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.413129 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8448c74b-bea3-42c0-95da-ab251a90ca9f-kolla-config\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.413199 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8448c74b-bea3-42c0-95da-ab251a90ca9f-config-data\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.414412 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8448c74b-bea3-42c0-95da-ab251a90ca9f-config-data\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.414899 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8448c74b-bea3-42c0-95da-ab251a90ca9f-kolla-config\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.419093 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8448c74b-bea3-42c0-95da-ab251a90ca9f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") 
" pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.424985 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8448c74b-bea3-42c0-95da-ab251a90ca9f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.443369 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xlz\" (UniqueName: \"kubernetes.io/projected/8448c74b-bea3-42c0-95da-ab251a90ca9f-kube-api-access-c9xlz\") pod \"memcached-0\" (UID: \"8448c74b-bea3-42c0-95da-ab251a90ca9f\") " pod="openstack/memcached-0" Oct 07 19:16:40 crc kubenswrapper[4825]: I1007 19:16:40.566887 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.139939 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.141118 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.142963 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xs8bc" Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.151201 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.242233 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h2s5\" (UniqueName: \"kubernetes.io/projected/66c0c344-091c-42cf-bfbb-bbdc83a37bce-kube-api-access-8h2s5\") pod \"kube-state-metrics-0\" (UID: \"66c0c344-091c-42cf-bfbb-bbdc83a37bce\") " pod="openstack/kube-state-metrics-0" Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.343260 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h2s5\" (UniqueName: \"kubernetes.io/projected/66c0c344-091c-42cf-bfbb-bbdc83a37bce-kube-api-access-8h2s5\") pod \"kube-state-metrics-0\" (UID: \"66c0c344-091c-42cf-bfbb-bbdc83a37bce\") " pod="openstack/kube-state-metrics-0" Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.365485 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h2s5\" (UniqueName: \"kubernetes.io/projected/66c0c344-091c-42cf-bfbb-bbdc83a37bce-kube-api-access-8h2s5\") pod \"kube-state-metrics-0\" (UID: \"66c0c344-091c-42cf-bfbb-bbdc83a37bce\") " pod="openstack/kube-state-metrics-0" Oct 07 19:16:42 crc kubenswrapper[4825]: I1007 19:16:42.459123 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.536980 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mqtlv"] Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.538576 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.550044 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zw4mb" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.550339 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.550549 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.559850 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9zcg2"] Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.562010 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.593036 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mqtlv"] Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.606218 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9zcg2"] Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.614971 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0392f085-cd23-439c-b8aa-e3c94fc320b8-combined-ca-bundle\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615028 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-lib\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615193 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-log-ovn\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615225 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rkt2\" (UniqueName: \"kubernetes.io/projected/16ff2637-d49f-4b3b-b3f4-b731b51e8875-kube-api-access-7rkt2\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc 
kubenswrapper[4825]: I1007 19:16:45.615357 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0392f085-cd23-439c-b8aa-e3c94fc320b8-scripts\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615415 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-etc-ovs\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615464 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7fvb\" (UniqueName: \"kubernetes.io/projected/0392f085-cd23-439c-b8aa-e3c94fc320b8-kube-api-access-s7fvb\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615537 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0392f085-cd23-439c-b8aa-e3c94fc320b8-ovn-controller-tls-certs\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615567 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-run\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 
19:16:45.615605 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-run-ovn\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615626 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ff2637-d49f-4b3b-b3f4-b731b51e8875-scripts\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615672 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-log\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.615764 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-run\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717340 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-run\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717419 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0392f085-cd23-439c-b8aa-e3c94fc320b8-combined-ca-bundle\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717459 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-lib\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717481 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-log-ovn\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717513 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rkt2\" (UniqueName: \"kubernetes.io/projected/16ff2637-d49f-4b3b-b3f4-b731b51e8875-kube-api-access-7rkt2\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717537 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0392f085-cd23-439c-b8aa-e3c94fc320b8-scripts\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717563 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-etc-ovs\") pod \"ovn-controller-ovs-9zcg2\" 
(UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717598 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7fvb\" (UniqueName: \"kubernetes.io/projected/0392f085-cd23-439c-b8aa-e3c94fc320b8-kube-api-access-s7fvb\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717644 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0392f085-cd23-439c-b8aa-e3c94fc320b8-ovn-controller-tls-certs\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717701 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-run\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717730 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-run-ovn\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.717753 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ff2637-d49f-4b3b-b3f4-b731b51e8875-scripts\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc 
kubenswrapper[4825]: I1007 19:16:45.717786 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-log\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.718144 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-run\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.718257 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-log\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.718578 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-run-ovn\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.718640 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-run\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.719332 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-var-lib\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.719478 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0392f085-cd23-439c-b8aa-e3c94fc320b8-var-log-ovn\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.719570 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/16ff2637-d49f-4b3b-b3f4-b731b51e8875-etc-ovs\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.721850 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16ff2637-d49f-4b3b-b3f4-b731b51e8875-scripts\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.721954 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0392f085-cd23-439c-b8aa-e3c94fc320b8-scripts\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.726341 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0392f085-cd23-439c-b8aa-e3c94fc320b8-ovn-controller-tls-certs\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " 
pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.726385 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0392f085-cd23-439c-b8aa-e3c94fc320b8-combined-ca-bundle\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.734644 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rkt2\" (UniqueName: \"kubernetes.io/projected/16ff2637-d49f-4b3b-b3f4-b731b51e8875-kube-api-access-7rkt2\") pod \"ovn-controller-ovs-9zcg2\" (UID: \"16ff2637-d49f-4b3b-b3f4-b731b51e8875\") " pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.743763 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7fvb\" (UniqueName: \"kubernetes.io/projected/0392f085-cd23-439c-b8aa-e3c94fc320b8-kube-api-access-s7fvb\") pod \"ovn-controller-mqtlv\" (UID: \"0392f085-cd23-439c-b8aa-e3c94fc320b8\") " pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.897807 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv" Oct 07 19:16:45 crc kubenswrapper[4825]: I1007 19:16:45.909711 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.527264 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.532314 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.538110 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-c5nz5" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.538323 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.539163 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.539340 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.539489 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.542176 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.642429 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.642478 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.642650 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-config\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.642756 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbj65\" (UniqueName: \"kubernetes.io/projected/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-kube-api-access-rbj65\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.642919 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.643013 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.643053 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.643075 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744306 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbj65\" (UniqueName: \"kubernetes.io/projected/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-kube-api-access-rbj65\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744407 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744459 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744488 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744517 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " 
pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744560 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744583 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744626 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-config\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744864 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.744942 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.745901 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-config\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.747095 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.749971 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.750590 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.752531 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.767858 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbj65\" (UniqueName: \"kubernetes.io/projected/96ff0bc5-e277-4f6a-a3b3-815e01ac42b7-kube-api-access-rbj65\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " 
pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.802305 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7\") " pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:46 crc kubenswrapper[4825]: I1007 19:16:46.878214 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 19:16:48 crc kubenswrapper[4825]: E1007 19:16:48.923166 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 19:16:48 crc kubenswrapper[4825]: E1007 19:16:48.924309 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9q7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-wbv9c_openstack(0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:16:48 crc kubenswrapper[4825]: E1007 19:16:48.925580 4825 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" podUID="0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376" Oct 07 19:16:48 crc kubenswrapper[4825]: E1007 19:16:48.963465 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 19:16:48 crc kubenswrapper[4825]: E1007 19:16:48.963614 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zs5d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2ldhk_openstack(2df21b34-c4dc-4b4a-a8d7-7dd8eece872f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:16:48 crc kubenswrapper[4825]: E1007 19:16:48.965903 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" podUID="2df21b34-c4dc-4b4a-a8d7-7dd8eece872f" Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.568720 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.585926 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lc2dg"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.601792 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-98zzb"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.625222 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.631511 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.636883 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.775487 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: W1007 19:16:49.778413 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6451a8c0_c6b1_4098_846d_24fe8c26d849.slice/crio-a164996de8b676192505cbccfaf02d8f1f5ee315ac213805712b214a2b726393 WatchSource:0}: Error finding container a164996de8b676192505cbccfaf02d8f1f5ee315ac213805712b214a2b726393: Status 404 returned error can't find the container with id a164996de8b676192505cbccfaf02d8f1f5ee315ac213805712b214a2b726393 Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.833582 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.841582 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.847893 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.850980 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.857614 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.857980 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.863034 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-44mx6" Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.882926 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.898383 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mqtlv"] Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.957193 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f43a8cb5-b546-476e-a429-12947216e9b0","Type":"ContainerStarted","Data":"f5cbeb286174014d6baa852120d099182efb2b4a6bee1dd2828dc77645692273"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.959160 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66c0c344-091c-42cf-bfbb-bbdc83a37bce","Type":"ContainerStarted","Data":"74d425e7f70445a3dd181fd6150a288a7da1df26ac7a1c2d2185b83d9bc3e091"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.964149 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"19bd5f67-ab1b-4816-8e44-f792ea626299","Type":"ContainerStarted","Data":"11d18c93ae0a82bb132f609a3f87567fb3a8a2563fb9570d15f100728dc8a27e"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.968106 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv" event={"ID":"0392f085-cd23-439c-b8aa-e3c94fc320b8","Type":"ContainerStarted","Data":"d1431865bf2b2bc6da848013afc5f5f20153e669eb376e3c4de2c522d6badcfc"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.969554 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9","Type":"ContainerStarted","Data":"2775aaada142a7f4a2c1ccbb6061170bb29198a2276d3828a52cfdb2b39c37ed"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.970808 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6451a8c0-c6b1-4098-846d-24fe8c26d849","Type":"ContainerStarted","Data":"a164996de8b676192505cbccfaf02d8f1f5ee315ac213805712b214a2b726393"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.971661 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" event={"ID":"57eca3f0-2ba2-4cee-9317-b2157f602944","Type":"ContainerStarted","Data":"fa7e9189dc6ecce8326a41c90a2e29d7f7ae351ad487d1f2838ecd878845f033"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.973082 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8448c74b-bea3-42c0-95da-ab251a90ca9f","Type":"ContainerStarted","Data":"eccb5b2f361e7581e0364d0882622846ee02c4bfce0e8f39e23c037405dfec95"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.978097 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" 
event={"ID":"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6","Type":"ContainerStarted","Data":"52467f1ec6e75be5ba6c3572e53785c3b8530b99a2ffb5647b45c4c72bc1fd45"} Oct 07 19:16:49 crc kubenswrapper[4825]: I1007 19:16:49.984049 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 19:16:49 crc kubenswrapper[4825]: W1007 19:16:49.988512 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ff0bc5_e277_4f6a_a3b3_815e01ac42b7.slice/crio-e84a94ad9112f8e396ee49ff0e61b07f927af22c97a875055b74476a677006a9 WatchSource:0}: Error finding container e84a94ad9112f8e396ee49ff0e61b07f927af22c97a875055b74476a677006a9: Status 404 returned error can't find the container with id e84a94ad9112f8e396ee49ff0e61b07f927af22c97a875055b74476a677006a9 Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.015728 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf48b556-d051-49b5-b9fb-fa6b325e0f79-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.015795 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4hps\" (UniqueName: \"kubernetes.io/projected/bf48b556-d051-49b5-b9fb-fa6b325e0f79-kube-api-access-l4hps\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.015826 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 
crc kubenswrapper[4825]: I1007 19:16:50.016471 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf48b556-d051-49b5-b9fb-fa6b325e0f79-config\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.017700 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.017829 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.017862 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf48b556-d051-49b5-b9fb-fa6b325e0f79-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.018106 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.083062 4825 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9zcg2"] Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119171 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119205 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf48b556-d051-49b5-b9fb-fa6b325e0f79-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119244 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf48b556-d051-49b5-b9fb-fa6b325e0f79-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119305 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119321 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l4hps\" (UniqueName: \"kubernetes.io/projected/bf48b556-d051-49b5-b9fb-fa6b325e0f79-kube-api-access-l4hps\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119360 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf48b556-d051-49b5-b9fb-fa6b325e0f79-config\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119374 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.119962 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.120135 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf48b556-d051-49b5-b9fb-fa6b325e0f79-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.120787 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf48b556-d051-49b5-b9fb-fa6b325e0f79-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.120947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf48b556-d051-49b5-b9fb-fa6b325e0f79-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.125132 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.126010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.126198 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf48b556-d051-49b5-b9fb-fa6b325e0f79-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.143001 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.144794 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l4hps\" (UniqueName: \"kubernetes.io/projected/bf48b556-d051-49b5-b9fb-fa6b325e0f79-kube-api-access-l4hps\") pod \"ovsdbserver-sb-0\" (UID: \"bf48b556-d051-49b5-b9fb-fa6b325e0f79\") " pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.175960 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.247272 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.426820 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-dns-svc\") pod \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.427218 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9q7v\" (UniqueName: \"kubernetes.io/projected/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-kube-api-access-n9q7v\") pod \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.427358 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-config\") pod \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\" (UID: \"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376\") " Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.427898 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376" (UID: 
"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.428535 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.429717 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-config" (OuterVolumeSpecName: "config") pod "0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376" (UID: "0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.431842 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-kube-api-access-n9q7v" (OuterVolumeSpecName: "kube-api-access-n9q7v") pod "0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376" (UID: "0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376"). InnerVolumeSpecName "kube-api-access-n9q7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.529816 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9q7v\" (UniqueName: \"kubernetes.io/projected/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-kube-api-access-n9q7v\") on node \"crc\" DevicePath \"\"" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.529848 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.547014 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.637921 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs5d2\" (UniqueName: \"kubernetes.io/projected/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-kube-api-access-zs5d2\") pod \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.638176 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-config\") pod \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\" (UID: \"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f\") " Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.638502 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-config" (OuterVolumeSpecName: "config") pod "2df21b34-c4dc-4b4a-a8d7-7dd8eece872f" (UID: "2df21b34-c4dc-4b4a-a8d7-7dd8eece872f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.639924 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.645522 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-kube-api-access-zs5d2" (OuterVolumeSpecName: "kube-api-access-zs5d2") pod "2df21b34-c4dc-4b4a-a8d7-7dd8eece872f" (UID: "2df21b34-c4dc-4b4a-a8d7-7dd8eece872f"). InnerVolumeSpecName "kube-api-access-zs5d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.736800 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.742311 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs5d2\" (UniqueName: \"kubernetes.io/projected/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f-kube-api-access-zs5d2\") on node \"crc\" DevicePath \"\"" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.983701 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-j2dqr"] Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.985047 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:50 crc kubenswrapper[4825]: I1007 19:16:50.996465 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-j2dqr"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.003128 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.028386 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7","Type":"ContainerStarted","Data":"e84a94ad9112f8e396ee49ff0e61b07f927af22c97a875055b74476a677006a9"} Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.038107 4825 generic.go:334] "Generic (PLEG): container finished" podID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerID="88747c9b773efbe2d6d1a5ee5b29480b3001c19bb98e79aa140e3b86e63dc2ff" exitCode=0 Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.038219 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" 
event={"ID":"57eca3f0-2ba2-4cee-9317-b2157f602944","Type":"ContainerDied","Data":"88747c9b773efbe2d6d1a5ee5b29480b3001c19bb98e79aa140e3b86e63dc2ff"} Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.043218 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerID="6e9078d9dd0b938b3836bba75f2bd89a5c00e0bcaec1de46c704b6a36503a827" exitCode=0 Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.043376 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" event={"ID":"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6","Type":"ContainerDied","Data":"6e9078d9dd0b938b3836bba75f2bd89a5c00e0bcaec1de46c704b6a36503a827"} Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.046589 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9zcg2" event={"ID":"16ff2637-d49f-4b3b-b3f4-b731b51e8875","Type":"ContainerStarted","Data":"123abe3c19a9dfdf022185cfe2bf43cd133f331d29757219468c08b82b1269fb"} Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.048241 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" event={"ID":"2df21b34-c4dc-4b4a-a8d7-7dd8eece872f","Type":"ContainerDied","Data":"e1fcfa42b3d1bef822c232ee8b10efcf63734b4499853fffdc952604802196e1"} Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.048263 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2ldhk" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.049013 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" event={"ID":"0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376","Type":"ContainerDied","Data":"2e344504afc5a2163a5951b1944d3cff77796648b44e43f75a6836f3eb4046c6"} Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.049053 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbv9c" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.153024 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5476bb52-18e5-41e6-b087-3cd2d6e81a87-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.153203 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5476bb52-18e5-41e6-b087-3cd2d6e81a87-ovs-rundir\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.153246 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5476bb52-18e5-41e6-b087-3cd2d6e81a87-combined-ca-bundle\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.153298 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5476bb52-18e5-41e6-b087-3cd2d6e81a87-config\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.153373 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5476bb52-18e5-41e6-b087-3cd2d6e81a87-ovn-rundir\") pod 
\"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.153409 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrm8f\" (UniqueName: \"kubernetes.io/projected/5476bb52-18e5-41e6-b087-3cd2d6e81a87-kube-api-access-nrm8f\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.256557 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5476bb52-18e5-41e6-b087-3cd2d6e81a87-ovs-rundir\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.256614 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5476bb52-18e5-41e6-b087-3cd2d6e81a87-combined-ca-bundle\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.256650 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5476bb52-18e5-41e6-b087-3cd2d6e81a87-config\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.256693 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5476bb52-18e5-41e6-b087-3cd2d6e81a87-ovn-rundir\") pod \"ovn-controller-metrics-j2dqr\" (UID: 
\"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.256722 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrm8f\" (UniqueName: \"kubernetes.io/projected/5476bb52-18e5-41e6-b087-3cd2d6e81a87-kube-api-access-nrm8f\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.256786 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5476bb52-18e5-41e6-b087-3cd2d6e81a87-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.271827 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5476bb52-18e5-41e6-b087-3cd2d6e81a87-ovs-rundir\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.276745 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5476bb52-18e5-41e6-b087-3cd2d6e81a87-config\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.277880 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5476bb52-18e5-41e6-b087-3cd2d6e81a87-ovn-rundir\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " 
pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.284172 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5476bb52-18e5-41e6-b087-3cd2d6e81a87-combined-ca-bundle\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.284199 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5476bb52-18e5-41e6-b087-3cd2d6e81a87-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.302469 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2ldhk"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.346101 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrm8f\" (UniqueName: \"kubernetes.io/projected/5476bb52-18e5-41e6-b087-3cd2d6e81a87-kube-api-access-nrm8f\") pod \"ovn-controller-metrics-j2dqr\" (UID: \"5476bb52-18e5-41e6-b087-3cd2d6e81a87\") " pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.357297 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2ldhk"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.371126 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-98zzb"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.440510 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbv9c"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.449641 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-wbv9c"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.458672 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zvx7z"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.461531 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.471637 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.480457 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zvx7z"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.502627 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lc2dg"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.518714 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qgxph"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.521264 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.530578 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.552578 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qgxph"] Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.566160 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd9j4\" (UniqueName: \"kubernetes.io/projected/56c7d5df-a107-481a-b89a-03a16e94f085-kube-api-access-qd9j4\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.566236 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.566257 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-config\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.566278 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.600948 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-j2dqr" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.668015 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.668071 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-config\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.668107 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.668162 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.668199 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h455\" (UniqueName: 
\"kubernetes.io/projected/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-kube-api-access-2h455\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.668264 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.670387 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.672419 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-config\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.672595 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-config\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.672721 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.672778 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd9j4\" (UniqueName: \"kubernetes.io/projected/56c7d5df-a107-481a-b89a-03a16e94f085-kube-api-access-qd9j4\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.673637 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.697935 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd9j4\" (UniqueName: \"kubernetes.io/projected/56c7d5df-a107-481a-b89a-03a16e94f085-kube-api-access-qd9j4\") pod \"dnsmasq-dns-7fd796d7df-zvx7z\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.778085 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.778427 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.778468 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h455\" (UniqueName: \"kubernetes.io/projected/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-kube-api-access-2h455\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.778545 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.778631 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-config\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.779288 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.779927 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-config\") pod 
\"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.779928 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.780034 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.808189 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h455\" (UniqueName: \"kubernetes.io/projected/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-kube-api-access-2h455\") pod \"dnsmasq-dns-86db49b7ff-qgxph\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.811839 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376" path="/var/lib/kubelet/pods/0e06dd3b-bb9d-49a5-91ff-fc8a28ea8376/volumes" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.812197 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df21b34-c4dc-4b4a-a8d7-7dd8eece872f" path="/var/lib/kubelet/pods/2df21b34-c4dc-4b4a-a8d7-7dd8eece872f/volumes" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.812614 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:16:51 crc kubenswrapper[4825]: I1007 19:16:51.865808 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:16:52 crc kubenswrapper[4825]: I1007 19:16:52.063800 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf48b556-d051-49b5-b9fb-fa6b325e0f79","Type":"ContainerStarted","Data":"83ea86433d6261bb48fd1563e51d6241eb1962973f66f3a3c50727562c6056c4"} Oct 07 19:16:52 crc kubenswrapper[4825]: I1007 19:16:52.805218 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-j2dqr"] Oct 07 19:16:53 crc kubenswrapper[4825]: I1007 19:16:53.140398 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qgxph"] Oct 07 19:16:53 crc kubenswrapper[4825]: I1007 19:16:53.184888 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zvx7z"] Oct 07 19:16:53 crc kubenswrapper[4825]: W1007 19:16:53.847313 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5476bb52_18e5_41e6_b087_3cd2d6e81a87.slice/crio-2d6fe634b4840c3dee8d209cfe16a7f160c9fa179da48227a6d4b31652f6db7d WatchSource:0}: Error finding container 2d6fe634b4840c3dee8d209cfe16a7f160c9fa179da48227a6d4b31652f6db7d: Status 404 returned error can't find the container with id 2d6fe634b4840c3dee8d209cfe16a7f160c9fa179da48227a6d4b31652f6db7d Oct 07 19:16:54 crc kubenswrapper[4825]: I1007 19:16:54.086765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-j2dqr" event={"ID":"5476bb52-18e5-41e6-b087-3cd2d6e81a87","Type":"ContainerStarted","Data":"2d6fe634b4840c3dee8d209cfe16a7f160c9fa179da48227a6d4b31652f6db7d"} Oct 07 19:16:54 crc kubenswrapper[4825]: W1007 19:16:54.755218 4825 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56c7d5df_a107_481a_b89a_03a16e94f085.slice/crio-99445c6f739864c70343084a9ecbad13fd860a44049dc49f0ca7813da071fbbc WatchSource:0}: Error finding container 99445c6f739864c70343084a9ecbad13fd860a44049dc49f0ca7813da071fbbc: Status 404 returned error can't find the container with id 99445c6f739864c70343084a9ecbad13fd860a44049dc49f0ca7813da071fbbc Oct 07 19:16:54 crc kubenswrapper[4825]: W1007 19:16:54.762263 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad9695e9_6d2d_42a0_85fe_a11f50c148e4.slice/crio-9af90fa9094b01967cdde759ed0e5597e44ecdef86276c27bc5f619a3f3ee7db WatchSource:0}: Error finding container 9af90fa9094b01967cdde759ed0e5597e44ecdef86276c27bc5f619a3f3ee7db: Status 404 returned error can't find the container with id 9af90fa9094b01967cdde759ed0e5597e44ecdef86276c27bc5f619a3f3ee7db Oct 07 19:16:55 crc kubenswrapper[4825]: I1007 19:16:55.094691 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" event={"ID":"56c7d5df-a107-481a-b89a-03a16e94f085","Type":"ContainerStarted","Data":"99445c6f739864c70343084a9ecbad13fd860a44049dc49f0ca7813da071fbbc"} Oct 07 19:16:55 crc kubenswrapper[4825]: I1007 19:16:55.095890 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" event={"ID":"ad9695e9-6d2d-42a0-85fe-a11f50c148e4","Type":"ContainerStarted","Data":"9af90fa9094b01967cdde759ed0e5597e44ecdef86276c27bc5f619a3f3ee7db"} Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 19:17:02.149600 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" event={"ID":"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6","Type":"ContainerStarted","Data":"9bb596719b7860d14cea532bbcf0e4c41a9d1fc0bb2dd5474cf0b63c324a1ce9"} Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 19:17:02.150144 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 19:17:02.149640 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" podUID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerName="dnsmasq-dns" containerID="cri-o://9bb596719b7860d14cea532bbcf0e4c41a9d1fc0bb2dd5474cf0b63c324a1ce9" gracePeriod=10 Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 19:17:02.154662 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" event={"ID":"57eca3f0-2ba2-4cee-9317-b2157f602944","Type":"ContainerStarted","Data":"3b84d0931612a1e93c4132303b40b31f1cfc646d94147f505854b40b58abb5a7"} Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 19:17:02.154803 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" podUID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerName="dnsmasq-dns" containerID="cri-o://3b84d0931612a1e93c4132303b40b31f1cfc646d94147f505854b40b58abb5a7" gracePeriod=10 Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 19:17:02.155006 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 19:17:02.179862 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" podStartSLOduration=26.647445551 podStartE2EDuration="27.1798445s" podCreationTimestamp="2025-10-07 19:16:35 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.614516088 +0000 UTC m=+998.436554735" lastFinishedPulling="2025-10-07 19:16:50.146915047 +0000 UTC m=+998.968953684" observedRunningTime="2025-10-07 19:17:02.178853849 +0000 UTC m=+1011.000892486" watchObservedRunningTime="2025-10-07 19:17:02.1798445 +0000 UTC m=+1011.001883127" Oct 07 19:17:02 crc kubenswrapper[4825]: I1007 
19:17:02.214992 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" podStartSLOduration=26.667793001 podStartE2EDuration="27.214977569s" podCreationTimestamp="2025-10-07 19:16:35 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.61163156 +0000 UTC m=+998.433670187" lastFinishedPulling="2025-10-07 19:16:50.158816118 +0000 UTC m=+998.980854755" observedRunningTime="2025-10-07 19:17:02.210974907 +0000 UTC m=+1011.033013544" watchObservedRunningTime="2025-10-07 19:17:02.214977569 +0000 UTC m=+1011.037016206" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.169869 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f43a8cb5-b546-476e-a429-12947216e9b0","Type":"ContainerStarted","Data":"e7089db7df0abbff50c6bfa62c1f82416a51470cf96c4044c29f1c4a871a3adc"} Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.176674 4825 generic.go:334] "Generic (PLEG): container finished" podID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerID="3b84d0931612a1e93c4132303b40b31f1cfc646d94147f505854b40b58abb5a7" exitCode=0 Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.176759 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" event={"ID":"57eca3f0-2ba2-4cee-9317-b2157f602944","Type":"ContainerDied","Data":"3b84d0931612a1e93c4132303b40b31f1cfc646d94147f505854b40b58abb5a7"} Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.176793 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" event={"ID":"57eca3f0-2ba2-4cee-9317-b2157f602944","Type":"ContainerDied","Data":"fa7e9189dc6ecce8326a41c90a2e29d7f7ae351ad487d1f2838ecd878845f033"} Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.176814 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa7e9189dc6ecce8326a41c90a2e29d7f7ae351ad487d1f2838ecd878845f033" Oct 07 
19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.180823 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.184069 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerID="9bb596719b7860d14cea532bbcf0e4c41a9d1fc0bb2dd5474cf0b63c324a1ce9" exitCode=0 Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.184169 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" event={"ID":"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6","Type":"ContainerDied","Data":"9bb596719b7860d14cea532bbcf0e4c41a9d1fc0bb2dd5474cf0b63c324a1ce9"} Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.184206 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" event={"ID":"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6","Type":"ContainerDied","Data":"52467f1ec6e75be5ba6c3572e53785c3b8530b99a2ffb5647b45c4c72bc1fd45"} Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.184239 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52467f1ec6e75be5ba6c3572e53785c3b8530b99a2ffb5647b45c4c72bc1fd45" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.186905 4825 generic.go:334] "Generic (PLEG): container finished" podID="56c7d5df-a107-481a-b89a-03a16e94f085" containerID="a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7" exitCode=0 Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.186961 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" event={"ID":"56c7d5df-a107-481a-b89a-03a16e94f085","Type":"ContainerDied","Data":"a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7"} Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.188953 4825 generic.go:334] "Generic (PLEG): container finished" podID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" 
containerID="7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75" exitCode=0 Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.189019 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" event={"ID":"ad9695e9-6d2d-42a0-85fe-a11f50c148e4","Type":"ContainerDied","Data":"7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75"} Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.275751 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.941565194 podStartE2EDuration="23.275735614s" podCreationTimestamp="2025-10-07 19:16:40 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.896328787 +0000 UTC m=+998.718367424" lastFinishedPulling="2025-10-07 19:17:01.230499207 +0000 UTC m=+1010.052537844" observedRunningTime="2025-10-07 19:17:03.269596276 +0000 UTC m=+1012.091634923" watchObservedRunningTime="2025-10-07 19:17:03.275735614 +0000 UTC m=+1012.097774251" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.715616 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.822469 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-dns-svc\") pod \"57eca3f0-2ba2-4cee-9317-b2157f602944\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.822538 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wskfm\" (UniqueName: \"kubernetes.io/projected/57eca3f0-2ba2-4cee-9317-b2157f602944-kube-api-access-wskfm\") pod \"57eca3f0-2ba2-4cee-9317-b2157f602944\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.822631 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-config\") pod \"57eca3f0-2ba2-4cee-9317-b2157f602944\" (UID: \"57eca3f0-2ba2-4cee-9317-b2157f602944\") " Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.830244 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57eca3f0-2ba2-4cee-9317-b2157f602944-kube-api-access-wskfm" (OuterVolumeSpecName: "kube-api-access-wskfm") pod "57eca3f0-2ba2-4cee-9317-b2157f602944" (UID: "57eca3f0-2ba2-4cee-9317-b2157f602944"). InnerVolumeSpecName "kube-api-access-wskfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.879152 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-config" (OuterVolumeSpecName: "config") pod "57eca3f0-2ba2-4cee-9317-b2157f602944" (UID: "57eca3f0-2ba2-4cee-9317-b2157f602944"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.884039 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57eca3f0-2ba2-4cee-9317-b2157f602944" (UID: "57eca3f0-2ba2-4cee-9317-b2157f602944"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.924268 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.924298 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wskfm\" (UniqueName: \"kubernetes.io/projected/57eca3f0-2ba2-4cee-9317-b2157f602944-kube-api-access-wskfm\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.924310 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57eca3f0-2ba2-4cee-9317-b2157f602944-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:03 crc kubenswrapper[4825]: I1007 19:17:03.933606 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.025994 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-config\") pod \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.026517 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chrzv\" (UniqueName: \"kubernetes.io/projected/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-kube-api-access-chrzv\") pod \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.026852 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-dns-svc\") pod \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\" (UID: \"f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6\") " Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.029918 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-kube-api-access-chrzv" (OuterVolumeSpecName: "kube-api-access-chrzv") pod "f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" (UID: "f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6"). InnerVolumeSpecName "kube-api-access-chrzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.062029 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" (UID: "f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.065340 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-config" (OuterVolumeSpecName: "config") pod "f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" (UID: "f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.130823 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chrzv\" (UniqueName: \"kubernetes.io/projected/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-kube-api-access-chrzv\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.131265 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.131303 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.201116 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6451a8c0-c6b1-4098-846d-24fe8c26d849","Type":"ContainerStarted","Data":"5fad2aaae2dcfda8767d5c76ee1dbab1a50e6b87c5c95d4d5f02145de9b30182"} Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.202622 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19bd5f67-ab1b-4816-8e44-f792ea626299","Type":"ContainerStarted","Data":"1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad"} Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.203732 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"8448c74b-bea3-42c0-95da-ab251a90ca9f","Type":"ContainerStarted","Data":"f9040d377ba9291b104d7b82a1902ea5fa2fc91c86b7c9758acdc56745635a99"} Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.203882 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-98zzb" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.204280 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lc2dg" Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.272401 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-98zzb"] Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.291076 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-98zzb"] Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.309053 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lc2dg"] Oct 07 19:17:04 crc kubenswrapper[4825]: I1007 19:17:04.334065 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lc2dg"] Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.213870 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf48b556-d051-49b5-b9fb-fa6b325e0f79","Type":"ContainerStarted","Data":"ce9df764698cc7495c1ed3297bc8b9c51a7bcfd8faa1afdf1f518bc1202572b2"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.214373 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf48b556-d051-49b5-b9fb-fa6b325e0f79","Type":"ContainerStarted","Data":"93ecff6b756889b656cac32b7962ca4f952333080e5a5b1428b9f765fce67f60"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.216148 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" 
event={"ID":"56c7d5df-a107-481a-b89a-03a16e94f085","Type":"ContainerStarted","Data":"dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.216408 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.218048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" event={"ID":"ad9695e9-6d2d-42a0-85fe-a11f50c148e4","Type":"ContainerStarted","Data":"436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.218283 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.220025 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9","Type":"ContainerStarted","Data":"fb7feff4407295b1ec22b4c10368bc8c0c0d8826a4d057372c1b923635385321"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.222406 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-j2dqr" event={"ID":"5476bb52-18e5-41e6-b087-3cd2d6e81a87","Type":"ContainerStarted","Data":"10a6db8842c6f0b5e01ae8efc67fbb9cff5cc92d159e21e567a22ace9a0ab2d8"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.224639 4825 generic.go:334] "Generic (PLEG): container finished" podID="16ff2637-d49f-4b3b-b3f4-b731b51e8875" containerID="340954f9c84bdd84ba9213058fee4c0bb7a95ed8dffeb436ee38d528ea1da70f" exitCode=0 Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.224692 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9zcg2" 
event={"ID":"16ff2637-d49f-4b3b-b3f4-b731b51e8875","Type":"ContainerDied","Data":"340954f9c84bdd84ba9213058fee4c0bb7a95ed8dffeb436ee38d528ea1da70f"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.227729 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv" event={"ID":"0392f085-cd23-439c-b8aa-e3c94fc320b8","Type":"ContainerStarted","Data":"28eda112bae91cdd7ba5f909e1763a352c2b87b3cf0f842f0601c9fb27c4d36e"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.227863 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mqtlv" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.229921 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7","Type":"ContainerStarted","Data":"85f13fa1442fcd1dc6082b907b41c638646703cce0e2ee9931f86cb8624f0834"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.229984 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"96ff0bc5-e277-4f6a-a3b3-815e01ac42b7","Type":"ContainerStarted","Data":"d6cf7371d5241811d1e137ec6002fb8ef45c6efdf98977e0793ca29c6a8dd42d"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.232937 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66c0c344-091c-42cf-bfbb-bbdc83a37bce","Type":"ContainerStarted","Data":"a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275"} Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.233702 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.248086 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.23224974 podStartE2EDuration="17.248060982s" podCreationTimestamp="2025-10-07 19:16:48 +0000 UTC" 
firstStartedPulling="2025-10-07 19:16:51.656681091 +0000 UTC m=+1000.478719728" lastFinishedPulling="2025-10-07 19:17:01.672492333 +0000 UTC m=+1010.494530970" observedRunningTime="2025-10-07 19:17:05.242484412 +0000 UTC m=+1014.064523109" watchObservedRunningTime="2025-10-07 19:17:05.248060982 +0000 UTC m=+1014.070099659" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.282104 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mqtlv" podStartSLOduration=8.503785795 podStartE2EDuration="20.282082196s" podCreationTimestamp="2025-10-07 19:16:45 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.894240074 +0000 UTC m=+998.716278711" lastFinishedPulling="2025-10-07 19:17:01.672536475 +0000 UTC m=+1010.494575112" observedRunningTime="2025-10-07 19:17:05.265263674 +0000 UTC m=+1014.087302301" watchObservedRunningTime="2025-10-07 19:17:05.282082196 +0000 UTC m=+1014.104120843" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.317876 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.63628387 podStartE2EDuration="20.317848296s" podCreationTimestamp="2025-10-07 19:16:45 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.991268828 +0000 UTC m=+998.813307465" lastFinishedPulling="2025-10-07 19:17:01.672833254 +0000 UTC m=+1010.494871891" observedRunningTime="2025-10-07 19:17:05.309714559 +0000 UTC m=+1014.131753216" watchObservedRunningTime="2025-10-07 19:17:05.317848296 +0000 UTC m=+1014.139886973" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.337298 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.300165357000001 podStartE2EDuration="23.337217596s" podCreationTimestamp="2025-10-07 19:16:42 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.627578085 +0000 UTC m=+998.449616732" lastFinishedPulling="2025-10-07 19:17:01.664630334 +0000 UTC 
m=+1010.486668971" observedRunningTime="2025-10-07 19:17:05.332008297 +0000 UTC m=+1014.154046944" watchObservedRunningTime="2025-10-07 19:17:05.337217596 +0000 UTC m=+1014.159256233" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.349127 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" podStartSLOduration=14.349107028 podStartE2EDuration="14.349107028s" podCreationTimestamp="2025-10-07 19:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:17:05.347728706 +0000 UTC m=+1014.169767343" watchObservedRunningTime="2025-10-07 19:17:05.349107028 +0000 UTC m=+1014.171145665" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.413249 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" podStartSLOduration=14.41321106 podStartE2EDuration="14.41321106s" podCreationTimestamp="2025-10-07 19:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:17:05.39057554 +0000 UTC m=+1014.212614177" watchObservedRunningTime="2025-10-07 19:17:05.41321106 +0000 UTC m=+1014.235249697" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.463918 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-j2dqr" podStartSLOduration=7.47004497 podStartE2EDuration="15.463891423s" podCreationTimestamp="2025-10-07 19:16:50 +0000 UTC" firstStartedPulling="2025-10-07 19:16:53.849595525 +0000 UTC m=+1002.671634162" lastFinishedPulling="2025-10-07 19:17:01.843441938 +0000 UTC m=+1010.665480615" observedRunningTime="2025-10-07 19:17:05.418755948 +0000 UTC m=+1014.240794585" watchObservedRunningTime="2025-10-07 19:17:05.463891423 +0000 UTC m=+1014.285930060" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 
19:17:05.708477 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.708746 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.708790 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.709394 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"906a228c17f7770f9388dbe04c2d4927f00d2c55a2ece21a3cd466abc03da78e"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.709449 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://906a228c17f7770f9388dbe04c2d4927f00d2c55a2ece21a3cd466abc03da78e" gracePeriod=600 Oct 07 19:17:05 crc kubenswrapper[4825]: I1007 19:17:05.806101 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57eca3f0-2ba2-4cee-9317-b2157f602944" path="/var/lib/kubelet/pods/57eca3f0-2ba2-4cee-9317-b2157f602944/volumes" Oct 07 19:17:05 
crc kubenswrapper[4825]: I1007 19:17:05.807288 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" path="/var/lib/kubelet/pods/f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6/volumes" Oct 07 19:17:06 crc kubenswrapper[4825]: I1007 19:17:06.260607 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="906a228c17f7770f9388dbe04c2d4927f00d2c55a2ece21a3cd466abc03da78e" exitCode=0 Oct 07 19:17:06 crc kubenswrapper[4825]: I1007 19:17:06.260693 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"906a228c17f7770f9388dbe04c2d4927f00d2c55a2ece21a3cd466abc03da78e"} Oct 07 19:17:06 crc kubenswrapper[4825]: I1007 19:17:06.261956 4825 scope.go:117] "RemoveContainer" containerID="1aff985c4d465af81432b2c0fd1da9cb01f3378b2087e04530a854de44547a92" Oct 07 19:17:06 crc kubenswrapper[4825]: I1007 19:17:06.879562 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 19:17:07 crc kubenswrapper[4825]: I1007 19:17:07.891821 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 19:17:07 crc kubenswrapper[4825]: I1007 19:17:07.967575 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 19:17:08 crc kubenswrapper[4825]: I1007 19:17:08.176217 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 07 19:17:08 crc kubenswrapper[4825]: I1007 19:17:08.234827 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 07 19:17:08 crc kubenswrapper[4825]: I1007 19:17:08.279843 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-sb-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.291665 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9zcg2" event={"ID":"16ff2637-d49f-4b3b-b3f4-b731b51e8875","Type":"ContainerStarted","Data":"dabd787159d300c326abbbc4615b1b2c9b481b5ed487e5ad3717417da45b6930"} Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.369686 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.373808 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.676114 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 19:17:09 crc kubenswrapper[4825]: E1007 19:17:09.676752 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerName="dnsmasq-dns" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.676767 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerName="dnsmasq-dns" Oct 07 19:17:09 crc kubenswrapper[4825]: E1007 19:17:09.676779 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerName="init" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.676786 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerName="init" Oct 07 19:17:09 crc kubenswrapper[4825]: E1007 19:17:09.676799 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerName="init" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.676805 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerName="init" Oct 07 19:17:09 crc 
kubenswrapper[4825]: E1007 19:17:09.676814 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerName="dnsmasq-dns" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.676820 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerName="dnsmasq-dns" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.676990 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="57eca3f0-2ba2-4cee-9317-b2157f602944" containerName="dnsmasq-dns" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.677003 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c90977-0a3e-4b6c-9a2b-1379cb0d97f6" containerName="dnsmasq-dns" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.678194 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.680164 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.680454 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.680594 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-p7k55" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.690920 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.691126 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.736447 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.736490 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-config\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.736533 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.736570 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.736610 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.736651 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5zkn\" (UniqueName: \"kubernetes.io/projected/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-kube-api-access-k5zkn\") pod \"ovn-northd-0\" 
(UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.736673 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-scripts\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.837843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.837905 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.837943 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.837986 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5zkn\" (UniqueName: \"kubernetes.io/projected/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-kube-api-access-k5zkn\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.838007 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-scripts\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.838035 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.838062 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-config\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.838541 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.839325 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-config\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.839437 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-scripts\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc 
kubenswrapper[4825]: I1007 19:17:09.845157 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.846057 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.858986 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:09 crc kubenswrapper[4825]: I1007 19:17:09.861356 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5zkn\" (UniqueName: \"kubernetes.io/projected/5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58-kube-api-access-k5zkn\") pod \"ovn-northd-0\" (UID: \"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58\") " pod="openstack/ovn-northd-0" Oct 07 19:17:10 crc kubenswrapper[4825]: I1007 19:17:10.007599 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 19:17:10 crc kubenswrapper[4825]: I1007 19:17:10.503295 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 19:17:10 crc kubenswrapper[4825]: I1007 19:17:10.569352 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.308666 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58","Type":"ContainerStarted","Data":"5dfcb2eae322aecf8232f7805b5b23bc481bc42663e9cd916779e188f6936f68"} Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.311685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"7a129d547f9c2f005540980fa89f701d13b633e45c1d0e5a234b2420081b437f"} Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.314985 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9zcg2" event={"ID":"16ff2637-d49f-4b3b-b3f4-b731b51e8875","Type":"ContainerStarted","Data":"1744e46d3de5b7d2deb90fe32825662b6c6518e67c70346388369d7f817270b8"} Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.315148 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.315247 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.360084 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9zcg2" podStartSLOduration=15.021065544 podStartE2EDuration="26.36006692s" podCreationTimestamp="2025-10-07 19:16:45 +0000 UTC" firstStartedPulling="2025-10-07 
19:16:50.116577813 +0000 UTC m=+998.938616450" lastFinishedPulling="2025-10-07 19:17:01.455579179 +0000 UTC m=+1010.277617826" observedRunningTime="2025-10-07 19:17:11.356189393 +0000 UTC m=+1020.178228030" watchObservedRunningTime="2025-10-07 19:17:11.36006692 +0000 UTC m=+1020.182105567" Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.814370 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.867429 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:17:11 crc kubenswrapper[4825]: I1007 19:17:11.937171 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zvx7z"] Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.325073 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58","Type":"ContainerStarted","Data":"2bdc95846ace1194dbe513fa96ee44716555f5eba4d7de093eff1fd47d041ee4"} Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.325118 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58","Type":"ContainerStarted","Data":"ffd555fd3ec87b2ed40c90788130a0e02e1db46b7ec550186df45ba8bec897a4"} Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.326049 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" podUID="56c7d5df-a107-481a-b89a-03a16e94f085" containerName="dnsmasq-dns" containerID="cri-o://dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e" gracePeriod=10 Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.326219 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.349311 
4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.040371708 podStartE2EDuration="3.349292618s" podCreationTimestamp="2025-10-07 19:17:09 +0000 UTC" firstStartedPulling="2025-10-07 19:17:10.515933751 +0000 UTC m=+1019.337972388" lastFinishedPulling="2025-10-07 19:17:11.824854661 +0000 UTC m=+1020.646893298" observedRunningTime="2025-10-07 19:17:12.345802532 +0000 UTC m=+1021.167841189" watchObservedRunningTime="2025-10-07 19:17:12.349292618 +0000 UTC m=+1021.171331275" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.474577 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.549679 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-9vwm4"] Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.551041 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.570176 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-9vwm4"] Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.583092 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-dns-svc\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.583182 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-config\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " 
pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.583452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmfk\" (UniqueName: \"kubernetes.io/projected/41179ae7-4ff4-4c39-81d2-8867a63917e6-kube-api-access-gxmfk\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.583523 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.583543 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.684922 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmfk\" (UniqueName: \"kubernetes.io/projected/41179ae7-4ff4-4c39-81d2-8867a63917e6-kube-api-access-gxmfk\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.685284 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: 
\"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.685302 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.685365 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-dns-svc\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.685395 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-config\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.686852 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.687022 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-config\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc 
kubenswrapper[4825]: I1007 19:17:12.687565 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-dns-svc\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.687900 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.710783 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmfk\" (UniqueName: \"kubernetes.io/projected/41179ae7-4ff4-4c39-81d2-8867a63917e6-kube-api-access-gxmfk\") pod \"dnsmasq-dns-698758b865-9vwm4\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.823666 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.876703 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.888042 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-config\") pod \"56c7d5df-a107-481a-b89a-03a16e94f085\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.888180 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd9j4\" (UniqueName: \"kubernetes.io/projected/56c7d5df-a107-481a-b89a-03a16e94f085-kube-api-access-qd9j4\") pod \"56c7d5df-a107-481a-b89a-03a16e94f085\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.888206 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-dns-svc\") pod \"56c7d5df-a107-481a-b89a-03a16e94f085\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.888361 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-ovsdbserver-nb\") pod \"56c7d5df-a107-481a-b89a-03a16e94f085\" (UID: \"56c7d5df-a107-481a-b89a-03a16e94f085\") " Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.891829 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c7d5df-a107-481a-b89a-03a16e94f085-kube-api-access-qd9j4" (OuterVolumeSpecName: "kube-api-access-qd9j4") pod "56c7d5df-a107-481a-b89a-03a16e94f085" (UID: "56c7d5df-a107-481a-b89a-03a16e94f085"). InnerVolumeSpecName "kube-api-access-qd9j4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.941567 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56c7d5df-a107-481a-b89a-03a16e94f085" (UID: "56c7d5df-a107-481a-b89a-03a16e94f085"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.956066 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-config" (OuterVolumeSpecName: "config") pod "56c7d5df-a107-481a-b89a-03a16e94f085" (UID: "56c7d5df-a107-481a-b89a-03a16e94f085"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.957852 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56c7d5df-a107-481a-b89a-03a16e94f085" (UID: "56c7d5df-a107-481a-b89a-03a16e94f085"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.989896 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd9j4\" (UniqueName: \"kubernetes.io/projected/56c7d5df-a107-481a-b89a-03a16e94f085-kube-api-access-qd9j4\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.989924 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.989936 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:12 crc kubenswrapper[4825]: I1007 19:17:12.989948 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c7d5df-a107-481a-b89a-03a16e94f085-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.341304 4825 generic.go:334] "Generic (PLEG): container finished" podID="56c7d5df-a107-481a-b89a-03a16e94f085" containerID="dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e" exitCode=0 Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.341389 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.341401 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" event={"ID":"56c7d5df-a107-481a-b89a-03a16e94f085","Type":"ContainerDied","Data":"dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e"} Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.341776 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-zvx7z" event={"ID":"56c7d5df-a107-481a-b89a-03a16e94f085","Type":"ContainerDied","Data":"99445c6f739864c70343084a9ecbad13fd860a44049dc49f0ca7813da071fbbc"} Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.341810 4825 scope.go:117] "RemoveContainer" containerID="dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.343312 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-9vwm4"] Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.386001 4825 scope.go:117] "RemoveContainer" containerID="a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.387629 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zvx7z"] Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.392862 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zvx7z"] Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.471964 4825 scope.go:117] "RemoveContainer" containerID="dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e" Oct 07 19:17:13 crc kubenswrapper[4825]: E1007 19:17:13.472682 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e\": container with ID 
starting with dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e not found: ID does not exist" containerID="dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.472725 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e"} err="failed to get container status \"dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e\": rpc error: code = NotFound desc = could not find container \"dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e\": container with ID starting with dacca8678e574cc2de925ee561d50499af93f431b0fd7432e6dc307e7ba7437e not found: ID does not exist" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.472753 4825 scope.go:117] "RemoveContainer" containerID="a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7" Oct 07 19:17:13 crc kubenswrapper[4825]: E1007 19:17:13.476208 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7\": container with ID starting with a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7 not found: ID does not exist" containerID="a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.476343 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7"} err="failed to get container status \"a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7\": rpc error: code = NotFound desc = could not find container \"a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7\": container with ID starting with a208e8e94956cb910abe7797f4d6d4882d303fd3083ed5b2bed25104b385eff7 not found: 
ID does not exist" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.696419 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 07 19:17:13 crc kubenswrapper[4825]: E1007 19:17:13.697079 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c7d5df-a107-481a-b89a-03a16e94f085" containerName="dnsmasq-dns" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.697092 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c7d5df-a107-481a-b89a-03a16e94f085" containerName="dnsmasq-dns" Oct 07 19:17:13 crc kubenswrapper[4825]: E1007 19:17:13.697123 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c7d5df-a107-481a-b89a-03a16e94f085" containerName="init" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.697129 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c7d5df-a107-481a-b89a-03a16e94f085" containerName="init" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.697285 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c7d5df-a107-481a-b89a-03a16e94f085" containerName="dnsmasq-dns" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.701526 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.703719 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-ffmd4" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.704381 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.704426 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.704806 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.724103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.804598 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.804663 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-lock\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.804697 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc 
kubenswrapper[4825]: I1007 19:17:13.804862 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-cache\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.804948 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrsqx\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-kube-api-access-jrsqx\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.810477 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c7d5df-a107-481a-b89a-03a16e94f085" path="/var/lib/kubelet/pods/56c7d5df-a107-481a-b89a-03a16e94f085/volumes" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.906584 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.906698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-lock\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.906729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " 
pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.906790 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-cache\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.906827 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrsqx\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-kube-api-access-jrsqx\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: E1007 19:17:13.907575 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 19:17:13 crc kubenswrapper[4825]: E1007 19:17:13.907599 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 19:17:13 crc kubenswrapper[4825]: E1007 19:17:13.907645 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift podName:43cb88e3-5a22-4562-86b0-b016c7ff1dcf nodeName:}" failed. No retries permitted until 2025-10-07 19:17:14.407626831 +0000 UTC m=+1023.229665478 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift") pod "swift-storage-0" (UID: "43cb88e3-5a22-4562-86b0-b016c7ff1dcf") : configmap "swift-ring-files" not found Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.908413 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.908608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-lock\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.910089 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-cache\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.936750 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrsqx\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-kube-api-access-jrsqx\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:13 crc kubenswrapper[4825]: I1007 19:17:13.937816 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " 
pod="openstack/swift-storage-0" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.234764 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vqzq5"] Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.236130 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.238978 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.241625 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.241669 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.251855 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vqzq5"] Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.288637 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vqzq5"] Oct 07 19:17:14 crc kubenswrapper[4825]: E1007 19:17:14.289431 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-9d9j5 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-9d9j5 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-vqzq5" podUID="e4171668-8bf7-401c-9c7a-250bf0530421" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.295055 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-whwz4"] Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.296493 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.312515 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-whwz4"] Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.313491 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-combined-ca-bundle\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.313636 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-scripts\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.313758 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-ring-data-devices\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.313877 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-scripts\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.313986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a46859-41a7-4783-9c3d-be9e48db5526-etc-swift\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.314108 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4171668-8bf7-401c-9c7a-250bf0530421-etc-swift\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.314257 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-combined-ca-bundle\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.314447 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-dispersionconf\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.314555 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d9j5\" (UniqueName: \"kubernetes.io/projected/e4171668-8bf7-401c-9c7a-250bf0530421-kube-api-access-9d9j5\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.314658 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-swiftconf\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.314784 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmnp\" (UniqueName: \"kubernetes.io/projected/13a46859-41a7-4783-9c3d-be9e48db5526-kube-api-access-ftmnp\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.314924 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-ring-data-devices\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.315040 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-dispersionconf\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.315152 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-swiftconf\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.349788 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="fa0ba0a4-872f-4ebd-8ee1-0e57174648a9" containerID="fb7feff4407295b1ec22b4c10368bc8c0c0d8826a4d057372c1b923635385321" exitCode=0 Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.349852 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9","Type":"ContainerDied","Data":"fb7feff4407295b1ec22b4c10368bc8c0c0d8826a4d057372c1b923635385321"} Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.353017 4825 generic.go:334] "Generic (PLEG): container finished" podID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerID="b6d0ee339a8dd3f05512b7bd310db99d28a1a32a79e11ab560300e53815b4c7e" exitCode=0 Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.353048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-9vwm4" event={"ID":"41179ae7-4ff4-4c39-81d2-8867a63917e6","Type":"ContainerDied","Data":"b6d0ee339a8dd3f05512b7bd310db99d28a1a32a79e11ab560300e53815b4c7e"} Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.353351 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-9vwm4" event={"ID":"41179ae7-4ff4-4c39-81d2-8867a63917e6","Type":"ContainerStarted","Data":"13cff80defa57c1f5a12ec452bdf650ee332ab4e77c95b41779a2d0911e6b085"} Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.354546 4825 generic.go:334] "Generic (PLEG): container finished" podID="6451a8c0-c6b1-4098-846d-24fe8c26d849" containerID="5fad2aaae2dcfda8767d5c76ee1dbab1a50e6b87c5c95d4d5f02145de9b30182" exitCode=0 Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.354584 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.354617 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6451a8c0-c6b1-4098-846d-24fe8c26d849","Type":"ContainerDied","Data":"5fad2aaae2dcfda8767d5c76ee1dbab1a50e6b87c5c95d4d5f02145de9b30182"} Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.363305 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417032 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-ring-data-devices\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417098 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-dispersionconf\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417136 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-swiftconf\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417168 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-combined-ca-bundle\") pod \"swift-ring-rebalance-vqzq5\" 
(UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417197 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-scripts\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417224 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-ring-data-devices\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417273 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-scripts\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417298 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a46859-41a7-4783-9c3d-be9e48db5526-etc-swift\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417332 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4171668-8bf7-401c-9c7a-250bf0530421-etc-swift\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc 
kubenswrapper[4825]: I1007 19:17:14.417363 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417403 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-combined-ca-bundle\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417442 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-dispersionconf\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417464 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d9j5\" (UniqueName: \"kubernetes.io/projected/e4171668-8bf7-401c-9c7a-250bf0530421-kube-api-access-9d9j5\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417486 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-swiftconf\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.417526 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ftmnp\" (UniqueName: \"kubernetes.io/projected/13a46859-41a7-4783-9c3d-be9e48db5526-kube-api-access-ftmnp\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.419080 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-scripts\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.419479 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4171668-8bf7-401c-9c7a-250bf0530421-etc-swift\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.419698 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-ring-data-devices\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.420043 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a46859-41a7-4783-9c3d-be9e48db5526-etc-swift\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.421779 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-scripts\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: E1007 19:17:14.421886 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 19:17:14 crc kubenswrapper[4825]: E1007 19:17:14.421901 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 19:17:14 crc kubenswrapper[4825]: E1007 19:17:14.421943 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift podName:43cb88e3-5a22-4562-86b0-b016c7ff1dcf nodeName:}" failed. No retries permitted until 2025-10-07 19:17:15.42192755 +0000 UTC m=+1024.243966207 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift") pod "swift-storage-0" (UID: "43cb88e3-5a22-4562-86b0-b016c7ff1dcf") : configmap "swift-ring-files" not found Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.424304 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-ring-data-devices\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.428740 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-dispersionconf\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc 
kubenswrapper[4825]: I1007 19:17:14.430048 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-dispersionconf\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.432868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-swiftconf\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.433526 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-swiftconf\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.433623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-combined-ca-bundle\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.433801 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-combined-ca-bundle\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.441092 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ftmnp\" (UniqueName: \"kubernetes.io/projected/13a46859-41a7-4783-9c3d-be9e48db5526-kube-api-access-ftmnp\") pod \"swift-ring-rebalance-whwz4\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.447845 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d9j5\" (UniqueName: \"kubernetes.io/projected/e4171668-8bf7-401c-9c7a-250bf0530421-kube-api-access-9d9j5\") pod \"swift-ring-rebalance-vqzq5\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.518395 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-scripts\") pod \"e4171668-8bf7-401c-9c7a-250bf0530421\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.518514 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4171668-8bf7-401c-9c7a-250bf0530421-etc-swift\") pod \"e4171668-8bf7-401c-9c7a-250bf0530421\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.518582 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-swiftconf\") pod \"e4171668-8bf7-401c-9c7a-250bf0530421\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.518633 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-combined-ca-bundle\") pod \"e4171668-8bf7-401c-9c7a-250bf0530421\" (UID: 
\"e4171668-8bf7-401c-9c7a-250bf0530421\") " Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.518732 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-ring-data-devices\") pod \"e4171668-8bf7-401c-9c7a-250bf0530421\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.518765 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-dispersionconf\") pod \"e4171668-8bf7-401c-9c7a-250bf0530421\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.519257 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4171668-8bf7-401c-9c7a-250bf0530421-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e4171668-8bf7-401c-9c7a-250bf0530421" (UID: "e4171668-8bf7-401c-9c7a-250bf0530421"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.519201 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e4171668-8bf7-401c-9c7a-250bf0530421" (UID: "e4171668-8bf7-401c-9c7a-250bf0530421"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.519305 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-scripts" (OuterVolumeSpecName: "scripts") pod "e4171668-8bf7-401c-9c7a-250bf0530421" (UID: "e4171668-8bf7-401c-9c7a-250bf0530421"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.520355 4825 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.521129 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4171668-8bf7-401c-9c7a-250bf0530421-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.521142 4825 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4171668-8bf7-401c-9c7a-250bf0530421-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.522972 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4171668-8bf7-401c-9c7a-250bf0530421" (UID: "e4171668-8bf7-401c-9c7a-250bf0530421"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.523092 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e4171668-8bf7-401c-9c7a-250bf0530421" (UID: "e4171668-8bf7-401c-9c7a-250bf0530421"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.523218 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e4171668-8bf7-401c-9c7a-250bf0530421" (UID: "e4171668-8bf7-401c-9c7a-250bf0530421"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.613353 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.622055 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d9j5\" (UniqueName: \"kubernetes.io/projected/e4171668-8bf7-401c-9c7a-250bf0530421-kube-api-access-9d9j5\") pod \"e4171668-8bf7-401c-9c7a-250bf0530421\" (UID: \"e4171668-8bf7-401c-9c7a-250bf0530421\") " Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.622616 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.622643 4825 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.622657 4825 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4171668-8bf7-401c-9c7a-250bf0530421-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.627966 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e4171668-8bf7-401c-9c7a-250bf0530421-kube-api-access-9d9j5" (OuterVolumeSpecName: "kube-api-access-9d9j5") pod "e4171668-8bf7-401c-9c7a-250bf0530421" (UID: "e4171668-8bf7-401c-9c7a-250bf0530421"). InnerVolumeSpecName "kube-api-access-9d9j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:14 crc kubenswrapper[4825]: I1007 19:17:14.724464 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d9j5\" (UniqueName: \"kubernetes.io/projected/e4171668-8bf7-401c-9c7a-250bf0530421-kube-api-access-9d9j5\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.055921 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-whwz4"] Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.365090 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-9vwm4" event={"ID":"41179ae7-4ff4-4c39-81d2-8867a63917e6","Type":"ContainerStarted","Data":"fbf4f01494b54defbea82af8c554f6165fc11ad711f24a704b0e02d1c1ebec19"} Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.365210 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.367293 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6451a8c0-c6b1-4098-846d-24fe8c26d849","Type":"ContainerStarted","Data":"2a44c3c075530c848623462190ed00800de23192bcd0db27c2c131902a2738ff"} Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.369777 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whwz4" event={"ID":"13a46859-41a7-4783-9c3d-be9e48db5526","Type":"ContainerStarted","Data":"658063dff04f6b81357294002cf7219ec4af16fdf2261a48fb4f6cd14e0d1e4b"} Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.371627 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vqzq5" Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.372348 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fa0ba0a4-872f-4ebd-8ee1-0e57174648a9","Type":"ContainerStarted","Data":"98fdd1fda2f5655da1ec786831a7bbf00b98281191205d34b199eea5b9785fd5"} Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.386758 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-9vwm4" podStartSLOduration=3.386736724 podStartE2EDuration="3.386736724s" podCreationTimestamp="2025-10-07 19:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:17:15.384460794 +0000 UTC m=+1024.206499431" watchObservedRunningTime="2025-10-07 19:17:15.386736724 +0000 UTC m=+1024.208775361" Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.413491 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.378613017 podStartE2EDuration="38.413471638s" podCreationTimestamp="2025-10-07 19:16:37 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.636342492 +0000 UTC m=+998.458381129" lastFinishedPulling="2025-10-07 19:17:01.671201113 +0000 UTC m=+1010.493239750" observedRunningTime="2025-10-07 19:17:15.404909357 +0000 UTC m=+1024.226948014" watchObservedRunningTime="2025-10-07 19:17:15.413471638 +0000 UTC m=+1024.235510275" Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.427464 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.007600806 podStartE2EDuration="37.427448234s" podCreationTimestamp="2025-10-07 19:16:38 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.786593766 +0000 UTC m=+998.608632403" lastFinishedPulling="2025-10-07 19:17:01.206441194 +0000 UTC 
m=+1010.028479831" observedRunningTime="2025-10-07 19:17:15.423576295 +0000 UTC m=+1024.245614932" watchObservedRunningTime="2025-10-07 19:17:15.427448234 +0000 UTC m=+1024.249486861" Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.437568 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:15 crc kubenswrapper[4825]: E1007 19:17:15.437809 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 19:17:15 crc kubenswrapper[4825]: E1007 19:17:15.437850 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 19:17:15 crc kubenswrapper[4825]: E1007 19:17:15.437920 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift podName:43cb88e3-5a22-4562-86b0-b016c7ff1dcf nodeName:}" failed. No retries permitted until 2025-10-07 19:17:17.437898171 +0000 UTC m=+1026.259936808 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift") pod "swift-storage-0" (UID: "43cb88e3-5a22-4562-86b0-b016c7ff1dcf") : configmap "swift-ring-files" not found Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.460833 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vqzq5"] Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.469845 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vqzq5"] Oct 07 19:17:15 crc kubenswrapper[4825]: I1007 19:17:15.813766 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4171668-8bf7-401c-9c7a-250bf0530421" path="/var/lib/kubelet/pods/e4171668-8bf7-401c-9c7a-250bf0530421/volumes" Oct 07 19:17:17 crc kubenswrapper[4825]: I1007 19:17:17.473908 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:17 crc kubenswrapper[4825]: E1007 19:17:17.474131 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 19:17:17 crc kubenswrapper[4825]: E1007 19:17:17.474359 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 19:17:17 crc kubenswrapper[4825]: E1007 19:17:17.474412 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift podName:43cb88e3-5a22-4562-86b0-b016c7ff1dcf nodeName:}" failed. No retries permitted until 2025-10-07 19:17:21.474395752 +0000 UTC m=+1030.296434389 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift") pod "swift-storage-0" (UID: "43cb88e3-5a22-4562-86b0-b016c7ff1dcf") : configmap "swift-ring-files" not found Oct 07 19:17:18 crc kubenswrapper[4825]: I1007 19:17:18.395014 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whwz4" event={"ID":"13a46859-41a7-4783-9c3d-be9e48db5526","Type":"ContainerStarted","Data":"3baa4d678bcf992da7077403f96c01d1d0450e2f451f2ea41d76bf0c1fed77ad"} Oct 07 19:17:18 crc kubenswrapper[4825]: I1007 19:17:18.419751 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-whwz4" podStartSLOduration=1.362966489 podStartE2EDuration="4.419731218s" podCreationTimestamp="2025-10-07 19:17:14 +0000 UTC" firstStartedPulling="2025-10-07 19:17:15.069376471 +0000 UTC m=+1023.891415108" lastFinishedPulling="2025-10-07 19:17:18.12614117 +0000 UTC m=+1026.948179837" observedRunningTime="2025-10-07 19:17:18.41762924 +0000 UTC m=+1027.239667887" watchObservedRunningTime="2025-10-07 19:17:18.419731218 +0000 UTC m=+1027.241769865" Oct 07 19:17:18 crc kubenswrapper[4825]: E1007 19:17:18.735065 4825 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.45:44016->38.102.83.45:42153: write tcp 38.102.83.45:44016->38.102.83.45:42153: write: broken pipe Oct 07 19:17:19 crc kubenswrapper[4825]: I1007 19:17:19.202742 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 19:17:19 crc kubenswrapper[4825]: I1007 19:17:19.202816 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 19:17:20 crc kubenswrapper[4825]: I1007 19:17:20.035698 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 07 19:17:20 crc 
kubenswrapper[4825]: I1007 19:17:20.036260 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 07 19:17:21 crc kubenswrapper[4825]: I1007 19:17:21.273810 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 07 19:17:21 crc kubenswrapper[4825]: I1007 19:17:21.330524 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 07 19:17:21 crc kubenswrapper[4825]: I1007 19:17:21.562807 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:21 crc kubenswrapper[4825]: E1007 19:17:21.563037 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 19:17:21 crc kubenswrapper[4825]: E1007 19:17:21.563074 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 19:17:21 crc kubenswrapper[4825]: E1007 19:17:21.563134 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift podName:43cb88e3-5a22-4562-86b0-b016c7ff1dcf nodeName:}" failed. No retries permitted until 2025-10-07 19:17:29.563112251 +0000 UTC m=+1038.385150938 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift") pod "swift-storage-0" (UID: "43cb88e3-5a22-4562-86b0-b016c7ff1dcf") : configmap "swift-ring-files" not found Oct 07 19:17:22 crc kubenswrapper[4825]: I1007 19:17:22.878615 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:22 crc kubenswrapper[4825]: I1007 19:17:22.969904 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qgxph"] Oct 07 19:17:22 crc kubenswrapper[4825]: I1007 19:17:22.970132 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" podUID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" containerName="dnsmasq-dns" containerID="cri-o://436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5" gracePeriod=10 Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.407400 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.438350 4825 generic.go:334] "Generic (PLEG): container finished" podID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" containerID="436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5" exitCode=0 Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.438392 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" event={"ID":"ad9695e9-6d2d-42a0-85fe-a11f50c148e4","Type":"ContainerDied","Data":"436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5"} Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.438427 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.438453 4825 scope.go:117] "RemoveContainer" containerID="436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.438438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qgxph" event={"ID":"ad9695e9-6d2d-42a0-85fe-a11f50c148e4","Type":"ContainerDied","Data":"9af90fa9094b01967cdde759ed0e5597e44ecdef86276c27bc5f619a3f3ee7db"} Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.456641 4825 scope.go:117] "RemoveContainer" containerID="7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.487396 4825 scope.go:117] "RemoveContainer" containerID="436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5" Oct 07 19:17:23 crc kubenswrapper[4825]: E1007 19:17:23.487932 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5\": container with ID starting with 436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5 not found: ID does not exist" containerID="436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.487995 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5"} err="failed to get container status \"436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5\": rpc error: code = NotFound desc = could not find container \"436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5\": container with ID starting with 436f6f20e36edd30a0a1b9e7e2e46cd400abd457c5e02ee096f451104d9fb5d5 not found: ID does not exist" Oct 07 19:17:23 crc 
kubenswrapper[4825]: I1007 19:17:23.488028 4825 scope.go:117] "RemoveContainer" containerID="7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75" Oct 07 19:17:23 crc kubenswrapper[4825]: E1007 19:17:23.488382 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75\": container with ID starting with 7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75 not found: ID does not exist" containerID="7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.488424 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75"} err="failed to get container status \"7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75\": rpc error: code = NotFound desc = could not find container \"7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75\": container with ID starting with 7e59d8dcf7dafd81bb2486a694edef9415b0078259a26c493d5b6d2c90792c75 not found: ID does not exist" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.507933 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-sb\") pod \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.508118 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-dns-svc\") pod \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.508209 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h455\" (UniqueName: \"kubernetes.io/projected/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-kube-api-access-2h455\") pod \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.508315 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-config\") pod \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.508407 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-nb\") pod \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\" (UID: \"ad9695e9-6d2d-42a0-85fe-a11f50c148e4\") " Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.516902 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-kube-api-access-2h455" (OuterVolumeSpecName: "kube-api-access-2h455") pod "ad9695e9-6d2d-42a0-85fe-a11f50c148e4" (UID: "ad9695e9-6d2d-42a0-85fe-a11f50c148e4"). InnerVolumeSpecName "kube-api-access-2h455". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.551606 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad9695e9-6d2d-42a0-85fe-a11f50c148e4" (UID: "ad9695e9-6d2d-42a0-85fe-a11f50c148e4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.551648 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad9695e9-6d2d-42a0-85fe-a11f50c148e4" (UID: "ad9695e9-6d2d-42a0-85fe-a11f50c148e4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.563429 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad9695e9-6d2d-42a0-85fe-a11f50c148e4" (UID: "ad9695e9-6d2d-42a0-85fe-a11f50c148e4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.571361 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-config" (OuterVolumeSpecName: "config") pod "ad9695e9-6d2d-42a0-85fe-a11f50c148e4" (UID: "ad9695e9-6d2d-42a0-85fe-a11f50c148e4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.610885 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.611170 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.611289 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h455\" (UniqueName: \"kubernetes.io/projected/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-kube-api-access-2h455\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.611369 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.611443 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad9695e9-6d2d-42a0-85fe-a11f50c148e4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.769375 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qgxph"] Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.776978 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qgxph"] Oct 07 19:17:23 crc kubenswrapper[4825]: I1007 19:17:23.805998 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" path="/var/lib/kubelet/pods/ad9695e9-6d2d-42a0-85fe-a11f50c148e4/volumes" Oct 07 19:17:24 crc kubenswrapper[4825]: 
I1007 19:17:24.128785 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 07 19:17:24 crc kubenswrapper[4825]: I1007 19:17:24.189886 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 07 19:17:25 crc kubenswrapper[4825]: I1007 19:17:25.092570 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 19:17:25 crc kubenswrapper[4825]: I1007 19:17:25.464534 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whwz4" event={"ID":"13a46859-41a7-4783-9c3d-be9e48db5526","Type":"ContainerDied","Data":"3baa4d678bcf992da7077403f96c01d1d0450e2f451f2ea41d76bf0c1fed77ad"} Oct 07 19:17:25 crc kubenswrapper[4825]: I1007 19:17:25.464459 4825 generic.go:334] "Generic (PLEG): container finished" podID="13a46859-41a7-4783-9c3d-be9e48db5526" containerID="3baa4d678bcf992da7077403f96c01d1d0450e2f451f2ea41d76bf0c1fed77ad" exitCode=0 Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.888918 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981012 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftmnp\" (UniqueName: \"kubernetes.io/projected/13a46859-41a7-4783-9c3d-be9e48db5526-kube-api-access-ftmnp\") pod \"13a46859-41a7-4783-9c3d-be9e48db5526\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981086 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a46859-41a7-4783-9c3d-be9e48db5526-etc-swift\") pod \"13a46859-41a7-4783-9c3d-be9e48db5526\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981133 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-ring-data-devices\") pod \"13a46859-41a7-4783-9c3d-be9e48db5526\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981167 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-swiftconf\") pod \"13a46859-41a7-4783-9c3d-be9e48db5526\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981194 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-dispersionconf\") pod \"13a46859-41a7-4783-9c3d-be9e48db5526\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981263 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-scripts\") pod \"13a46859-41a7-4783-9c3d-be9e48db5526\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981327 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-combined-ca-bundle\") pod \"13a46859-41a7-4783-9c3d-be9e48db5526\" (UID: \"13a46859-41a7-4783-9c3d-be9e48db5526\") " Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.981988 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "13a46859-41a7-4783-9c3d-be9e48db5526" (UID: "13a46859-41a7-4783-9c3d-be9e48db5526"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.982114 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a46859-41a7-4783-9c3d-be9e48db5526-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "13a46859-41a7-4783-9c3d-be9e48db5526" (UID: "13a46859-41a7-4783-9c3d-be9e48db5526"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.982366 4825 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a46859-41a7-4783-9c3d-be9e48db5526-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.982385 4825 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.987111 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a46859-41a7-4783-9c3d-be9e48db5526-kube-api-access-ftmnp" (OuterVolumeSpecName: "kube-api-access-ftmnp") pod "13a46859-41a7-4783-9c3d-be9e48db5526" (UID: "13a46859-41a7-4783-9c3d-be9e48db5526"). InnerVolumeSpecName "kube-api-access-ftmnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:26 crc kubenswrapper[4825]: I1007 19:17:26.989615 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "13a46859-41a7-4783-9c3d-be9e48db5526" (UID: "13a46859-41a7-4783-9c3d-be9e48db5526"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.003902 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13a46859-41a7-4783-9c3d-be9e48db5526" (UID: "13a46859-41a7-4783-9c3d-be9e48db5526"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.006288 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "13a46859-41a7-4783-9c3d-be9e48db5526" (UID: "13a46859-41a7-4783-9c3d-be9e48db5526"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.008729 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-scripts" (OuterVolumeSpecName: "scripts") pod "13a46859-41a7-4783-9c3d-be9e48db5526" (UID: "13a46859-41a7-4783-9c3d-be9e48db5526"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.084293 4825 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.084346 4825 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.084369 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a46859-41a7-4783-9c3d-be9e48db5526-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.084387 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a46859-41a7-4783-9c3d-be9e48db5526-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 
19:17:27.084407 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftmnp\" (UniqueName: \"kubernetes.io/projected/13a46859-41a7-4783-9c3d-be9e48db5526-kube-api-access-ftmnp\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.486395 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whwz4" event={"ID":"13a46859-41a7-4783-9c3d-be9e48db5526","Type":"ContainerDied","Data":"658063dff04f6b81357294002cf7219ec4af16fdf2261a48fb4f6cd14e0d1e4b"} Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.486434 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658063dff04f6b81357294002cf7219ec4af16fdf2261a48fb4f6cd14e0d1e4b" Oct 07 19:17:27 crc kubenswrapper[4825]: I1007 19:17:27.486487 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-whwz4" Oct 07 19:17:29 crc kubenswrapper[4825]: I1007 19:17:29.645457 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:29 crc kubenswrapper[4825]: I1007 19:17:29.659722 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/43cb88e3-5a22-4562-86b0-b016c7ff1dcf-etc-swift\") pod \"swift-storage-0\" (UID: \"43cb88e3-5a22-4562-86b0-b016c7ff1dcf\") " pod="openstack/swift-storage-0" Oct 07 19:17:29 crc kubenswrapper[4825]: I1007 19:17:29.929010 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.279150 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dh8kx"] Oct 07 19:17:30 crc kubenswrapper[4825]: E1007 19:17:30.279728 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" containerName="dnsmasq-dns" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.279744 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" containerName="dnsmasq-dns" Oct 07 19:17:30 crc kubenswrapper[4825]: E1007 19:17:30.279770 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a46859-41a7-4783-9c3d-be9e48db5526" containerName="swift-ring-rebalance" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.279777 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a46859-41a7-4783-9c3d-be9e48db5526" containerName="swift-ring-rebalance" Oct 07 19:17:30 crc kubenswrapper[4825]: E1007 19:17:30.279799 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" containerName="init" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.279808 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" containerName="init" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.279955 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad9695e9-6d2d-42a0-85fe-a11f50c148e4" containerName="dnsmasq-dns" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.279965 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a46859-41a7-4783-9c3d-be9e48db5526" containerName="swift-ring-rebalance" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.280524 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dh8kx" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.286160 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dh8kx"] Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.339750 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.368519 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxzs7\" (UniqueName: \"kubernetes.io/projected/e295199f-c701-41c4-a4a2-4cd8a1897681-kube-api-access-fxzs7\") pod \"keystone-db-create-dh8kx\" (UID: \"e295199f-c701-41c4-a4a2-4cd8a1897681\") " pod="openstack/keystone-db-create-dh8kx" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.459483 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dbtvh"] Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.460815 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dbtvh" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.469882 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxzs7\" (UniqueName: \"kubernetes.io/projected/e295199f-c701-41c4-a4a2-4cd8a1897681-kube-api-access-fxzs7\") pod \"keystone-db-create-dh8kx\" (UID: \"e295199f-c701-41c4-a4a2-4cd8a1897681\") " pod="openstack/keystone-db-create-dh8kx" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.476748 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dbtvh"] Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.490059 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxzs7\" (UniqueName: \"kubernetes.io/projected/e295199f-c701-41c4-a4a2-4cd8a1897681-kube-api-access-fxzs7\") pod \"keystone-db-create-dh8kx\" (UID: \"e295199f-c701-41c4-a4a2-4cd8a1897681\") " pod="openstack/keystone-db-create-dh8kx" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.518977 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"1a48a2964448beb402455120471852cc351c250385f76f1c0b13f08ae78ec9b7"} Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.571697 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvk7t\" (UniqueName: \"kubernetes.io/projected/d69393f2-e05f-41e8-89ca-a8aa9717edf1-kube-api-access-vvk7t\") pod \"placement-db-create-dbtvh\" (UID: \"d69393f2-e05f-41e8-89ca-a8aa9717edf1\") " pod="openstack/placement-db-create-dbtvh" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.603042 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dh8kx" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.674446 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvk7t\" (UniqueName: \"kubernetes.io/projected/d69393f2-e05f-41e8-89ca-a8aa9717edf1-kube-api-access-vvk7t\") pod \"placement-db-create-dbtvh\" (UID: \"d69393f2-e05f-41e8-89ca-a8aa9717edf1\") " pod="openstack/placement-db-create-dbtvh" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.703684 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvk7t\" (UniqueName: \"kubernetes.io/projected/d69393f2-e05f-41e8-89ca-a8aa9717edf1-kube-api-access-vvk7t\") pod \"placement-db-create-dbtvh\" (UID: \"d69393f2-e05f-41e8-89ca-a8aa9717edf1\") " pod="openstack/placement-db-create-dbtvh" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.712557 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ljvxn"] Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.714137 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ljvxn" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.722410 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ljvxn"] Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.779794 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dbtvh" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.780212 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5bmv\" (UniqueName: \"kubernetes.io/projected/005059cf-fde6-46af-9e47-def8362671af-kube-api-access-h5bmv\") pod \"glance-db-create-ljvxn\" (UID: \"005059cf-fde6-46af-9e47-def8362671af\") " pod="openstack/glance-db-create-ljvxn" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.882286 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bmv\" (UniqueName: \"kubernetes.io/projected/005059cf-fde6-46af-9e47-def8362671af-kube-api-access-h5bmv\") pod \"glance-db-create-ljvxn\" (UID: \"005059cf-fde6-46af-9e47-def8362671af\") " pod="openstack/glance-db-create-ljvxn" Oct 07 19:17:30 crc kubenswrapper[4825]: I1007 19:17:30.919955 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5bmv\" (UniqueName: \"kubernetes.io/projected/005059cf-fde6-46af-9e47-def8362671af-kube-api-access-h5bmv\") pod \"glance-db-create-ljvxn\" (UID: \"005059cf-fde6-46af-9e47-def8362671af\") " pod="openstack/glance-db-create-ljvxn" Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.099967 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ljvxn" Oct 07 19:17:31 crc kubenswrapper[4825]: W1007 19:17:31.128786 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode295199f_c701_41c4_a4a2_4cd8a1897681.slice/crio-279b735960e54ed59bc4a41e044fef17f1f418cbd78d7af32967755531f1aaa2 WatchSource:0}: Error finding container 279b735960e54ed59bc4a41e044fef17f1f418cbd78d7af32967755531f1aaa2: Status 404 returned error can't find the container with id 279b735960e54ed59bc4a41e044fef17f1f418cbd78d7af32967755531f1aaa2 Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.129752 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dh8kx"] Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.225112 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dbtvh"] Oct 07 19:17:31 crc kubenswrapper[4825]: W1007 19:17:31.236564 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd69393f2_e05f_41e8_89ca_a8aa9717edf1.slice/crio-cbf1bd2326527819329c99640c3f1a9396841e4474df5adad5b2fa5eef0650d6 WatchSource:0}: Error finding container cbf1bd2326527819329c99640c3f1a9396841e4474df5adad5b2fa5eef0650d6: Status 404 returned error can't find the container with id cbf1bd2326527819329c99640c3f1a9396841e4474df5adad5b2fa5eef0650d6 Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.535760 4825 generic.go:334] "Generic (PLEG): container finished" podID="e295199f-c701-41c4-a4a2-4cd8a1897681" containerID="ea01e24da72514c65848fb39b7e2e7ad1508bdb376f291a214d63d4264607f7f" exitCode=0 Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.535829 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dh8kx" 
event={"ID":"e295199f-c701-41c4-a4a2-4cd8a1897681","Type":"ContainerDied","Data":"ea01e24da72514c65848fb39b7e2e7ad1508bdb376f291a214d63d4264607f7f"} Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.535876 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dh8kx" event={"ID":"e295199f-c701-41c4-a4a2-4cd8a1897681","Type":"ContainerStarted","Data":"279b735960e54ed59bc4a41e044fef17f1f418cbd78d7af32967755531f1aaa2"} Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.540102 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dbtvh" event={"ID":"d69393f2-e05f-41e8-89ca-a8aa9717edf1","Type":"ContainerStarted","Data":"cbf1bd2326527819329c99640c3f1a9396841e4474df5adad5b2fa5eef0650d6"} Oct 07 19:17:31 crc kubenswrapper[4825]: I1007 19:17:31.541095 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ljvxn"] Oct 07 19:17:31 crc kubenswrapper[4825]: W1007 19:17:31.561922 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod005059cf_fde6_46af_9e47_def8362671af.slice/crio-30ae1698b550691abe00662f0756c5c581e67c55b99b4e521605ac5a1b673eff WatchSource:0}: Error finding container 30ae1698b550691abe00662f0756c5c581e67c55b99b4e521605ac5a1b673eff: Status 404 returned error can't find the container with id 30ae1698b550691abe00662f0756c5c581e67c55b99b4e521605ac5a1b673eff Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.553720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"cdcd0c82b923381b035abce3aaa08b97601b5bda36fd787a3f19974908e8a2b1"} Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.554140 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"9634b9e793ee16e9a017ad4d9f9dfa9b197c4964476fd6b927cd037a3f2fe1b3"} Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.556261 4825 generic.go:334] "Generic (PLEG): container finished" podID="005059cf-fde6-46af-9e47-def8362671af" containerID="cc8b159a89d0eff39eceec9651706693ffcca0a14ba51fbc537e8e502bc47306" exitCode=0 Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.556338 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ljvxn" event={"ID":"005059cf-fde6-46af-9e47-def8362671af","Type":"ContainerDied","Data":"cc8b159a89d0eff39eceec9651706693ffcca0a14ba51fbc537e8e502bc47306"} Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.556367 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ljvxn" event={"ID":"005059cf-fde6-46af-9e47-def8362671af","Type":"ContainerStarted","Data":"30ae1698b550691abe00662f0756c5c581e67c55b99b4e521605ac5a1b673eff"} Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.558378 4825 generic.go:334] "Generic (PLEG): container finished" podID="d69393f2-e05f-41e8-89ca-a8aa9717edf1" containerID="336720f4ec97f4c6f4a84147a1bb3dd9b49938348276b7bfd58a4cb20b68e31e" exitCode=0 Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.558441 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dbtvh" event={"ID":"d69393f2-e05f-41e8-89ca-a8aa9717edf1","Type":"ContainerDied","Data":"336720f4ec97f4c6f4a84147a1bb3dd9b49938348276b7bfd58a4cb20b68e31e"} Oct 07 19:17:32 crc kubenswrapper[4825]: I1007 19:17:32.982181 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dh8kx" Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.069505 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxzs7\" (UniqueName: \"kubernetes.io/projected/e295199f-c701-41c4-a4a2-4cd8a1897681-kube-api-access-fxzs7\") pod \"e295199f-c701-41c4-a4a2-4cd8a1897681\" (UID: \"e295199f-c701-41c4-a4a2-4cd8a1897681\") " Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.076471 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e295199f-c701-41c4-a4a2-4cd8a1897681-kube-api-access-fxzs7" (OuterVolumeSpecName: "kube-api-access-fxzs7") pod "e295199f-c701-41c4-a4a2-4cd8a1897681" (UID: "e295199f-c701-41c4-a4a2-4cd8a1897681"). InnerVolumeSpecName "kube-api-access-fxzs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.171261 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxzs7\" (UniqueName: \"kubernetes.io/projected/e295199f-c701-41c4-a4a2-4cd8a1897681-kube-api-access-fxzs7\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.572889 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"789e92ea7ab66356e3eb9a61d3d72f47e989159a5b6f157e2a10965087b55de5"} Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.573302 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"5b3217301a03fd0fdfad0c8548ac2ededd1e88609fc0bb623c61455d0e7491f6"} Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.575534 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dh8kx" 
event={"ID":"e295199f-c701-41c4-a4a2-4cd8a1897681","Type":"ContainerDied","Data":"279b735960e54ed59bc4a41e044fef17f1f418cbd78d7af32967755531f1aaa2"} Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.575606 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="279b735960e54ed59bc4a41e044fef17f1f418cbd78d7af32967755531f1aaa2" Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.575729 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dh8kx" Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.966586 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dbtvh" Oct 07 19:17:33 crc kubenswrapper[4825]: I1007 19:17:33.985202 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ljvxn" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.087488 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5bmv\" (UniqueName: \"kubernetes.io/projected/005059cf-fde6-46af-9e47-def8362671af-kube-api-access-h5bmv\") pod \"005059cf-fde6-46af-9e47-def8362671af\" (UID: \"005059cf-fde6-46af-9e47-def8362671af\") " Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.087600 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvk7t\" (UniqueName: \"kubernetes.io/projected/d69393f2-e05f-41e8-89ca-a8aa9717edf1-kube-api-access-vvk7t\") pod \"d69393f2-e05f-41e8-89ca-a8aa9717edf1\" (UID: \"d69393f2-e05f-41e8-89ca-a8aa9717edf1\") " Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.093203 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/005059cf-fde6-46af-9e47-def8362671af-kube-api-access-h5bmv" (OuterVolumeSpecName: "kube-api-access-h5bmv") pod "005059cf-fde6-46af-9e47-def8362671af" (UID: 
"005059cf-fde6-46af-9e47-def8362671af"). InnerVolumeSpecName "kube-api-access-h5bmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.094057 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69393f2-e05f-41e8-89ca-a8aa9717edf1-kube-api-access-vvk7t" (OuterVolumeSpecName: "kube-api-access-vvk7t") pod "d69393f2-e05f-41e8-89ca-a8aa9717edf1" (UID: "d69393f2-e05f-41e8-89ca-a8aa9717edf1"). InnerVolumeSpecName "kube-api-access-vvk7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.189847 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5bmv\" (UniqueName: \"kubernetes.io/projected/005059cf-fde6-46af-9e47-def8362671af-kube-api-access-h5bmv\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.189884 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvk7t\" (UniqueName: \"kubernetes.io/projected/d69393f2-e05f-41e8-89ca-a8aa9717edf1-kube-api-access-vvk7t\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.590255 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ljvxn" event={"ID":"005059cf-fde6-46af-9e47-def8362671af","Type":"ContainerDied","Data":"30ae1698b550691abe00662f0756c5c581e67c55b99b4e521605ac5a1b673eff"} Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.590317 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30ae1698b550691abe00662f0756c5c581e67c55b99b4e521605ac5a1b673eff" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.590270 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ljvxn" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.592824 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dbtvh" event={"ID":"d69393f2-e05f-41e8-89ca-a8aa9717edf1","Type":"ContainerDied","Data":"cbf1bd2326527819329c99640c3f1a9396841e4474df5adad5b2fa5eef0650d6"} Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.592886 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dbtvh" Oct 07 19:17:34 crc kubenswrapper[4825]: I1007 19:17:34.592891 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbf1bd2326527819329c99640c3f1a9396841e4474df5adad5b2fa5eef0650d6" Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.603449 4825 generic.go:334] "Generic (PLEG): container finished" podID="f43a8cb5-b546-476e-a429-12947216e9b0" containerID="e7089db7df0abbff50c6bfa62c1f82416a51470cf96c4044c29f1c4a871a3adc" exitCode=0 Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.603506 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f43a8cb5-b546-476e-a429-12947216e9b0","Type":"ContainerDied","Data":"e7089db7df0abbff50c6bfa62c1f82416a51470cf96c4044c29f1c4a871a3adc"} Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.606363 4825 generic.go:334] "Generic (PLEG): container finished" podID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerID="1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad" exitCode=0 Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.606553 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19bd5f67-ab1b-4816-8e44-f792ea626299","Type":"ContainerDied","Data":"1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad"} Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.615717 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"e67ab545c738ea27576e2643626101ad24956848ff99bf465112c6a639ad1a26"} Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.615772 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"f4f901fc6fb1ec2fe4028019b729566c657964277d7969dfb389858efae396dd"} Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.615787 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"efff738d712797a5b44028eeb4bef70a18160df1743a2356b249c401abde3ad4"} Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.615803 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"cee9141fb47a1fa7f46d3498be8f0110a331c1b0aa33165c937cd784d7f76724"} Oct 07 19:17:35 crc kubenswrapper[4825]: I1007 19:17:35.943458 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mqtlv" podUID="0392f085-cd23-439c-b8aa-e3c94fc320b8" containerName="ovn-controller" probeResult="failure" output=< Oct 07 19:17:35 crc kubenswrapper[4825]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 19:17:35 crc kubenswrapper[4825]: > Oct 07 19:17:36 crc kubenswrapper[4825]: I1007 19:17:36.625199 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f43a8cb5-b546-476e-a429-12947216e9b0","Type":"ContainerStarted","Data":"6c136e84a59f4391d0345e16fc9393797c5051f0fd6778063acfdfa39b8ede8c"} Oct 07 19:17:36 crc kubenswrapper[4825]: I1007 19:17:36.625471 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:17:36 crc kubenswrapper[4825]: I1007 19:17:36.627757 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19bd5f67-ab1b-4816-8e44-f792ea626299","Type":"ContainerStarted","Data":"b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7"} Oct 07 19:17:36 crc kubenswrapper[4825]: I1007 19:17:36.628319 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 19:17:36 crc kubenswrapper[4825]: I1007 19:17:36.632363 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"e57a5ff56d181d4150f15c0271b18f9affcb296452a78c33361db8664ed00143"} Oct 07 19:17:36 crc kubenswrapper[4825]: I1007 19:17:36.649843 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.052658984 podStartE2EDuration="1m1.64979644s" podCreationTimestamp="2025-10-07 19:16:35 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.594475087 +0000 UTC m=+998.416513724" lastFinishedPulling="2025-10-07 19:17:01.191612543 +0000 UTC m=+1010.013651180" observedRunningTime="2025-10-07 19:17:36.646577517 +0000 UTC m=+1045.468616174" watchObservedRunningTime="2025-10-07 19:17:36.64979644 +0000 UTC m=+1045.471835077" Oct 07 19:17:36 crc kubenswrapper[4825]: I1007 19:17:36.678136 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.145993421 podStartE2EDuration="1m1.678117237s" podCreationTimestamp="2025-10-07 19:16:35 +0000 UTC" firstStartedPulling="2025-10-07 19:16:49.621199861 +0000 UTC m=+998.443238498" lastFinishedPulling="2025-10-07 19:17:01.153323667 +0000 UTC m=+1009.975362314" observedRunningTime="2025-10-07 19:17:36.672711414 +0000 UTC m=+1045.494750051" 
watchObservedRunningTime="2025-10-07 19:17:36.678117237 +0000 UTC m=+1045.500155874" Oct 07 19:17:37 crc kubenswrapper[4825]: I1007 19:17:37.644587 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"f3ce0b4ce2eb359a09cb6ab7ec2d19e24f7a97450b25477287eb7cf600a3a341"} Oct 07 19:17:37 crc kubenswrapper[4825]: I1007 19:17:37.645146 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"e369682e171dfbf3c0276302309c930edc2aa1b1e3ddda157b268e39ffc73990"} Oct 07 19:17:37 crc kubenswrapper[4825]: I1007 19:17:37.645158 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"7c50befe84653196ec844188ab1a2b655fc7e1572fb2ebfef133b3502845afe3"} Oct 07 19:17:37 crc kubenswrapper[4825]: I1007 19:17:37.645168 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"dba33960c16746c2fd27b18b8d095f963ca74bd40ed16e8100511d5b8fcfba7c"} Oct 07 19:17:37 crc kubenswrapper[4825]: I1007 19:17:37.645191 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"c344a319577ee87ddf1bf9f1cba40a72ad770bef9d5296781e4d84810c658a69"} Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.662778 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"43cb88e3-5a22-4562-86b0-b016c7ff1dcf","Type":"ContainerStarted","Data":"e6cc6544dfb5309ee033fb51c7d0849db7c18e5e7441a8453695bb12e0faf7e2"} Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.749634 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.740195499 podStartE2EDuration="26.749610087s" podCreationTimestamp="2025-10-07 19:17:12 +0000 UTC" firstStartedPulling="2025-10-07 19:17:30.350009998 +0000 UTC m=+1039.172048635" lastFinishedPulling="2025-10-07 19:17:36.359424586 +0000 UTC m=+1045.181463223" observedRunningTime="2025-10-07 19:17:38.73691178 +0000 UTC m=+1047.558950497" watchObservedRunningTime="2025-10-07 19:17:38.749610087 +0000 UTC m=+1047.571648734" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.985714 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-thd5m"] Oct 07 19:17:38 crc kubenswrapper[4825]: E1007 19:17:38.986115 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="005059cf-fde6-46af-9e47-def8362671af" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.986135 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="005059cf-fde6-46af-9e47-def8362671af" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: E1007 19:17:38.986155 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69393f2-e05f-41e8-89ca-a8aa9717edf1" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.986165 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69393f2-e05f-41e8-89ca-a8aa9717edf1" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: E1007 19:17:38.986198 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e295199f-c701-41c4-a4a2-4cd8a1897681" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.986206 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e295199f-c701-41c4-a4a2-4cd8a1897681" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 
19:17:38.986441 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e295199f-c701-41c4-a4a2-4cd8a1897681" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.986474 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69393f2-e05f-41e8-89ca-a8aa9717edf1" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.986511 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="005059cf-fde6-46af-9e47-def8362671af" containerName="mariadb-database-create" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.987372 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:38 crc kubenswrapper[4825]: I1007 19:17:38.989744 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.000418 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-thd5m"] Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.078155 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-config\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.078247 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpmdm\" (UniqueName: \"kubernetes.io/projected/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-kube-api-access-tpmdm\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.078321 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.078412 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.078507 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.078547 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.180251 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: 
I1007 19:17:39.180329 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-config\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.180364 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpmdm\" (UniqueName: \"kubernetes.io/projected/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-kube-api-access-tpmdm\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.180425 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.180496 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.180647 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.181849 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.182724 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.183477 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-config\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.184493 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.185292 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.209290 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpmdm\" (UniqueName: 
\"kubernetes.io/projected/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-kube-api-access-tpmdm\") pod \"dnsmasq-dns-77585f5f8c-thd5m\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.309559 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.577455 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-thd5m"] Oct 07 19:17:39 crc kubenswrapper[4825]: W1007 19:17:39.587405 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6df6a4eb_4cdf_4080_98bb_1b23e46755cb.slice/crio-aa45d2bc4166c4326df69a017da6fdd01cc64945b71ca15244bf4cf4875b1675 WatchSource:0}: Error finding container aa45d2bc4166c4326df69a017da6fdd01cc64945b71ca15244bf4cf4875b1675: Status 404 returned error can't find the container with id aa45d2bc4166c4326df69a017da6fdd01cc64945b71ca15244bf4cf4875b1675 Oct 07 19:17:39 crc kubenswrapper[4825]: I1007 19:17:39.675023 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" event={"ID":"6df6a4eb-4cdf-4080-98bb-1b23e46755cb","Type":"ContainerStarted","Data":"aa45d2bc4166c4326df69a017da6fdd01cc64945b71ca15244bf4cf4875b1675"} Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.234040 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c5cc-account-create-2mfjn"] Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.235047 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c5cc-account-create-2mfjn" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.238079 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.244724 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5cc-account-create-2mfjn"] Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.395966 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd2cx\" (UniqueName: \"kubernetes.io/projected/1172748f-cd1c-42f8-86d2-1da4cd47b03a-kube-api-access-fd2cx\") pod \"keystone-c5cc-account-create-2mfjn\" (UID: \"1172748f-cd1c-42f8-86d2-1da4cd47b03a\") " pod="openstack/keystone-c5cc-account-create-2mfjn" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.497099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2cx\" (UniqueName: \"kubernetes.io/projected/1172748f-cd1c-42f8-86d2-1da4cd47b03a-kube-api-access-fd2cx\") pod \"keystone-c5cc-account-create-2mfjn\" (UID: \"1172748f-cd1c-42f8-86d2-1da4cd47b03a\") " pod="openstack/keystone-c5cc-account-create-2mfjn" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.542962 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd2cx\" (UniqueName: \"kubernetes.io/projected/1172748f-cd1c-42f8-86d2-1da4cd47b03a-kube-api-access-fd2cx\") pod \"keystone-c5cc-account-create-2mfjn\" (UID: \"1172748f-cd1c-42f8-86d2-1da4cd47b03a\") " pod="openstack/keystone-c5cc-account-create-2mfjn" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.584279 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c5cc-account-create-2mfjn" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.604925 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c2d6-account-create-2m5ms"] Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.606718 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c2d6-account-create-2m5ms" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.612216 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.619471 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c2d6-account-create-2m5ms"] Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.699455 4825 generic.go:334] "Generic (PLEG): container finished" podID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerID="e8ce30b5917d42b82d34b2604d3a74fc9898f44bc2d2ec1715e74b921c8fc0bd" exitCode=0 Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.699560 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" event={"ID":"6df6a4eb-4cdf-4080-98bb-1b23e46755cb","Type":"ContainerDied","Data":"e8ce30b5917d42b82d34b2604d3a74fc9898f44bc2d2ec1715e74b921c8fc0bd"} Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.699656 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgst\" (UniqueName: \"kubernetes.io/projected/ac5fcb37-d0be-4175-a3e0-b5589250ccd5-kube-api-access-pdgst\") pod \"placement-c2d6-account-create-2m5ms\" (UID: \"ac5fcb37-d0be-4175-a3e0-b5589250ccd5\") " pod="openstack/placement-c2d6-account-create-2m5ms" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.801832 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgst\" (UniqueName: 
\"kubernetes.io/projected/ac5fcb37-d0be-4175-a3e0-b5589250ccd5-kube-api-access-pdgst\") pod \"placement-c2d6-account-create-2m5ms\" (UID: \"ac5fcb37-d0be-4175-a3e0-b5589250ccd5\") " pod="openstack/placement-c2d6-account-create-2m5ms" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.808809 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0b20-account-create-rjd5j"] Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.811839 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0b20-account-create-rjd5j" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.814313 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.822155 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgst\" (UniqueName: \"kubernetes.io/projected/ac5fcb37-d0be-4175-a3e0-b5589250ccd5-kube-api-access-pdgst\") pod \"placement-c2d6-account-create-2m5ms\" (UID: \"ac5fcb37-d0be-4175-a3e0-b5589250ccd5\") " pod="openstack/placement-c2d6-account-create-2m5ms" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.830685 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0b20-account-create-rjd5j"] Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.878671 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5cc-account-create-2mfjn"] Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.903504 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/db3cf1a1-3d5d-4acf-b424-c215ca427d3e-kube-api-access-gstbp\") pod \"glance-0b20-account-create-rjd5j\" (UID: \"db3cf1a1-3d5d-4acf-b424-c215ca427d3e\") " pod="openstack/glance-0b20-account-create-rjd5j" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.962457 4825 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mqtlv" podUID="0392f085-cd23-439c-b8aa-e3c94fc320b8" containerName="ovn-controller" probeResult="failure" output=< Oct 07 19:17:40 crc kubenswrapper[4825]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 19:17:40 crc kubenswrapper[4825]: > Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.965690 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:17:40 crc kubenswrapper[4825]: I1007 19:17:40.992652 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c2d6-account-create-2m5ms" Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.005216 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/db3cf1a1-3d5d-4acf-b424-c215ca427d3e-kube-api-access-gstbp\") pod \"glance-0b20-account-create-rjd5j\" (UID: \"db3cf1a1-3d5d-4acf-b424-c215ca427d3e\") " pod="openstack/glance-0b20-account-create-rjd5j" Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.026935 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/db3cf1a1-3d5d-4acf-b424-c215ca427d3e-kube-api-access-gstbp\") pod \"glance-0b20-account-create-rjd5j\" (UID: \"db3cf1a1-3d5d-4acf-b424-c215ca427d3e\") " pod="openstack/glance-0b20-account-create-rjd5j" Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.143559 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0b20-account-create-rjd5j" Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.410064 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c2d6-account-create-2m5ms"] Oct 07 19:17:41 crc kubenswrapper[4825]: W1007 19:17:41.411309 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac5fcb37_d0be_4175_a3e0_b5589250ccd5.slice/crio-35bd63838487248ab09abb5d450ae5fadcfbae534bd5cbb3b436393ad33ab0d9 WatchSource:0}: Error finding container 35bd63838487248ab09abb5d450ae5fadcfbae534bd5cbb3b436393ad33ab0d9: Status 404 returned error can't find the container with id 35bd63838487248ab09abb5d450ae5fadcfbae534bd5cbb3b436393ad33ab0d9 Oct 07 19:17:41 crc kubenswrapper[4825]: W1007 19:17:41.609310 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb3cf1a1_3d5d_4acf_b424_c215ca427d3e.slice/crio-f5a92bf024f6238c63b4dc8353a0350e6e4762a1fd464b2a427b8529a0697853 WatchSource:0}: Error finding container f5a92bf024f6238c63b4dc8353a0350e6e4762a1fd464b2a427b8529a0697853: Status 404 returned error can't find the container with id f5a92bf024f6238c63b4dc8353a0350e6e4762a1fd464b2a427b8529a0697853 Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.613872 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0b20-account-create-rjd5j"] Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.709286 4825 generic.go:334] "Generic (PLEG): container finished" podID="ac5fcb37-d0be-4175-a3e0-b5589250ccd5" containerID="777b6dcead485ffa0a13b0b10bf4cbac471ea96b1183bd40f9df13682a518ec2" exitCode=0 Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.709354 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c2d6-account-create-2m5ms" 
event={"ID":"ac5fcb37-d0be-4175-a3e0-b5589250ccd5","Type":"ContainerDied","Data":"777b6dcead485ffa0a13b0b10bf4cbac471ea96b1183bd40f9df13682a518ec2"} Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.709382 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c2d6-account-create-2m5ms" event={"ID":"ac5fcb37-d0be-4175-a3e0-b5589250ccd5","Type":"ContainerStarted","Data":"35bd63838487248ab09abb5d450ae5fadcfbae534bd5cbb3b436393ad33ab0d9"} Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.712194 4825 generic.go:334] "Generic (PLEG): container finished" podID="1172748f-cd1c-42f8-86d2-1da4cd47b03a" containerID="adc9cabd9033f2005d4e39c9bb95c2237951be0da8326cbdc65d2292147de465" exitCode=0 Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.712260 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5cc-account-create-2mfjn" event={"ID":"1172748f-cd1c-42f8-86d2-1da4cd47b03a","Type":"ContainerDied","Data":"adc9cabd9033f2005d4e39c9bb95c2237951be0da8326cbdc65d2292147de465"} Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.712298 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5cc-account-create-2mfjn" event={"ID":"1172748f-cd1c-42f8-86d2-1da4cd47b03a","Type":"ContainerStarted","Data":"d593e76e170fda13dc03748d00947aed912f3bba7336384c27de709fbcdc2860"} Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.713812 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0b20-account-create-rjd5j" event={"ID":"db3cf1a1-3d5d-4acf-b424-c215ca427d3e","Type":"ContainerStarted","Data":"f5a92bf024f6238c63b4dc8353a0350e6e4762a1fd464b2a427b8529a0697853"} Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.715534 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" event={"ID":"6df6a4eb-4cdf-4080-98bb-1b23e46755cb","Type":"ContainerStarted","Data":"03374d2ad036aa2e6ad45f8e20c41a857fc989a03e2bed79eae350e12f365976"} 
Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.715745 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:41 crc kubenswrapper[4825]: I1007 19:17:41.764116 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" podStartSLOduration=3.764098014 podStartE2EDuration="3.764098014s" podCreationTimestamp="2025-10-07 19:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:17:41.760609803 +0000 UTC m=+1050.582648440" watchObservedRunningTime="2025-10-07 19:17:41.764098014 +0000 UTC m=+1050.586136651" Oct 07 19:17:42 crc kubenswrapper[4825]: I1007 19:17:42.729381 4825 generic.go:334] "Generic (PLEG): container finished" podID="db3cf1a1-3d5d-4acf-b424-c215ca427d3e" containerID="fcf57d3eadb0cabfe0e63d6fc61b5ed24f23b9319bb52363dbf9f08302c69f5f" exitCode=0 Oct 07 19:17:42 crc kubenswrapper[4825]: I1007 19:17:42.729475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0b20-account-create-rjd5j" event={"ID":"db3cf1a1-3d5d-4acf-b424-c215ca427d3e","Type":"ContainerDied","Data":"fcf57d3eadb0cabfe0e63d6fc61b5ed24f23b9319bb52363dbf9f08302c69f5f"} Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.250282 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c5cc-account-create-2mfjn" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.258730 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c2d6-account-create-2m5ms" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.350820 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd2cx\" (UniqueName: \"kubernetes.io/projected/1172748f-cd1c-42f8-86d2-1da4cd47b03a-kube-api-access-fd2cx\") pod \"1172748f-cd1c-42f8-86d2-1da4cd47b03a\" (UID: \"1172748f-cd1c-42f8-86d2-1da4cd47b03a\") " Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.350871 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdgst\" (UniqueName: \"kubernetes.io/projected/ac5fcb37-d0be-4175-a3e0-b5589250ccd5-kube-api-access-pdgst\") pod \"ac5fcb37-d0be-4175-a3e0-b5589250ccd5\" (UID: \"ac5fcb37-d0be-4175-a3e0-b5589250ccd5\") " Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.357892 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5fcb37-d0be-4175-a3e0-b5589250ccd5-kube-api-access-pdgst" (OuterVolumeSpecName: "kube-api-access-pdgst") pod "ac5fcb37-d0be-4175-a3e0-b5589250ccd5" (UID: "ac5fcb37-d0be-4175-a3e0-b5589250ccd5"). InnerVolumeSpecName "kube-api-access-pdgst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.359412 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1172748f-cd1c-42f8-86d2-1da4cd47b03a-kube-api-access-fd2cx" (OuterVolumeSpecName: "kube-api-access-fd2cx") pod "1172748f-cd1c-42f8-86d2-1da4cd47b03a" (UID: "1172748f-cd1c-42f8-86d2-1da4cd47b03a"). InnerVolumeSpecName "kube-api-access-fd2cx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.452523 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd2cx\" (UniqueName: \"kubernetes.io/projected/1172748f-cd1c-42f8-86d2-1da4cd47b03a-kube-api-access-fd2cx\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.452555 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdgst\" (UniqueName: \"kubernetes.io/projected/ac5fcb37-d0be-4175-a3e0-b5589250ccd5-kube-api-access-pdgst\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.743766 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c2d6-account-create-2m5ms" event={"ID":"ac5fcb37-d0be-4175-a3e0-b5589250ccd5","Type":"ContainerDied","Data":"35bd63838487248ab09abb5d450ae5fadcfbae534bd5cbb3b436393ad33ab0d9"} Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.743816 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bd63838487248ab09abb5d450ae5fadcfbae534bd5cbb3b436393ad33ab0d9" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.743835 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c2d6-account-create-2m5ms" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.747361 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5cc-account-create-2mfjn" event={"ID":"1172748f-cd1c-42f8-86d2-1da4cd47b03a","Type":"ContainerDied","Data":"d593e76e170fda13dc03748d00947aed912f3bba7336384c27de709fbcdc2860"} Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.747396 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d593e76e170fda13dc03748d00947aed912f3bba7336384c27de709fbcdc2860" Oct 07 19:17:43 crc kubenswrapper[4825]: I1007 19:17:43.747430 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c5cc-account-create-2mfjn" Oct 07 19:17:44 crc kubenswrapper[4825]: I1007 19:17:44.023447 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0b20-account-create-rjd5j" Oct 07 19:17:44 crc kubenswrapper[4825]: I1007 19:17:44.162384 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/db3cf1a1-3d5d-4acf-b424-c215ca427d3e-kube-api-access-gstbp\") pod \"db3cf1a1-3d5d-4acf-b424-c215ca427d3e\" (UID: \"db3cf1a1-3d5d-4acf-b424-c215ca427d3e\") " Oct 07 19:17:44 crc kubenswrapper[4825]: I1007 19:17:44.168369 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3cf1a1-3d5d-4acf-b424-c215ca427d3e-kube-api-access-gstbp" (OuterVolumeSpecName: "kube-api-access-gstbp") pod "db3cf1a1-3d5d-4acf-b424-c215ca427d3e" (UID: "db3cf1a1-3d5d-4acf-b424-c215ca427d3e"). InnerVolumeSpecName "kube-api-access-gstbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:44 crc kubenswrapper[4825]: I1007 19:17:44.264878 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/db3cf1a1-3d5d-4acf-b424-c215ca427d3e-kube-api-access-gstbp\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:44 crc kubenswrapper[4825]: I1007 19:17:44.761563 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0b20-account-create-rjd5j" event={"ID":"db3cf1a1-3d5d-4acf-b424-c215ca427d3e","Type":"ContainerDied","Data":"f5a92bf024f6238c63b4dc8353a0350e6e4762a1fd464b2a427b8529a0697853"} Oct 07 19:17:44 crc kubenswrapper[4825]: I1007 19:17:44.762687 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5a92bf024f6238c63b4dc8353a0350e6e4762a1fd464b2a427b8529a0697853" Oct 07 19:17:44 crc kubenswrapper[4825]: I1007 19:17:44.761666 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0b20-account-create-rjd5j" Oct 07 19:17:45 crc kubenswrapper[4825]: I1007 19:17:45.982589 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mqtlv" podUID="0392f085-cd23-439c-b8aa-e3c94fc320b8" containerName="ovn-controller" probeResult="failure" output=< Oct 07 19:17:45 crc kubenswrapper[4825]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 19:17:45 crc kubenswrapper[4825]: > Oct 07 19:17:45 crc kubenswrapper[4825]: I1007 19:17:45.997574 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9zcg2" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.037200 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-cw4gf"] Oct 07 19:17:46 crc kubenswrapper[4825]: E1007 19:17:46.037947 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1172748f-cd1c-42f8-86d2-1da4cd47b03a" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.037965 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1172748f-cd1c-42f8-86d2-1da4cd47b03a" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: E1007 19:17:46.037993 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5fcb37-d0be-4175-a3e0-b5589250ccd5" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.037999 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5fcb37-d0be-4175-a3e0-b5589250ccd5" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: E1007 19:17:46.038024 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3cf1a1-3d5d-4acf-b424-c215ca427d3e" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.038032 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3cf1a1-3d5d-4acf-b424-c215ca427d3e" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.038175 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1172748f-cd1c-42f8-86d2-1da4cd47b03a" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.038208 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5fcb37-d0be-4175-a3e0-b5589250ccd5" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.038217 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3cf1a1-3d5d-4acf-b424-c215ca427d3e" containerName="mariadb-account-create" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.039048 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.041057 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gq8x4" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.041598 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.046149 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cw4gf"] Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.201561 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mqtlv-config-7vc5g"] Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.201764 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-combined-ca-bundle\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.201883 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-db-sync-config-data\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.201939 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh9ds\" (UniqueName: \"kubernetes.io/projected/e981b526-0afb-4a9c-ba89-fe87728f4603-kube-api-access-kh9ds\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.202013 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-config-data\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.202691 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.205791 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.211533 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mqtlv-config-7vc5g"] Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.303850 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-combined-ca-bundle\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.303932 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304029 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-db-sync-config-data\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " 
pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304082 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh9ds\" (UniqueName: \"kubernetes.io/projected/e981b526-0afb-4a9c-ba89-fe87728f4603-kube-api-access-kh9ds\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304122 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-additional-scripts\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304187 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-config-data\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304273 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-log-ovn\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304320 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdqgm\" (UniqueName: \"kubernetes.io/projected/ee871f13-1e33-47c7-aad9-849657a07bbb-kube-api-access-pdqgm\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: 
\"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304525 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run-ovn\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.304683 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-scripts\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.312105 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-db-sync-config-data\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.312684 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-combined-ca-bundle\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.313674 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-config-data\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " 
pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.336960 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh9ds\" (UniqueName: \"kubernetes.io/projected/e981b526-0afb-4a9c-ba89-fe87728f4603-kube-api-access-kh9ds\") pod \"glance-db-sync-cw4gf\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.357191 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cw4gf" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.406692 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-additional-scripts\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.406825 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-log-ovn\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.406872 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdqgm\" (UniqueName: \"kubernetes.io/projected/ee871f13-1e33-47c7-aad9-849657a07bbb-kube-api-access-pdqgm\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.406953 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run-ovn\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.407031 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-scripts\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.407101 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.407505 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.409523 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run-ovn\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.409625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-log-ovn\") pod 
\"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.411883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-scripts\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.413129 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-additional-scripts\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.427273 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdqgm\" (UniqueName: \"kubernetes.io/projected/ee871f13-1e33-47c7-aad9-849657a07bbb-kube-api-access-pdqgm\") pod \"ovn-controller-mqtlv-config-7vc5g\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.546611 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.929150 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cw4gf"] Oct 07 19:17:46 crc kubenswrapper[4825]: I1007 19:17:46.932412 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 19:17:47 crc kubenswrapper[4825]: W1007 19:17:47.030809 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee871f13_1e33_47c7_aad9_849657a07bbb.slice/crio-84cd778a680a4eed01cbff5d1f1a9ff1097b8409913b7f355f2ae5b28b022fb4 WatchSource:0}: Error finding container 84cd778a680a4eed01cbff5d1f1a9ff1097b8409913b7f355f2ae5b28b022fb4: Status 404 returned error can't find the container with id 84cd778a680a4eed01cbff5d1f1a9ff1097b8409913b7f355f2ae5b28b022fb4 Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.033919 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mqtlv-config-7vc5g"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.232398 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.334370 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-p8xwm"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.335588 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-p8xwm" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.354770 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p8xwm"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.437655 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdjf\" (UniqueName: \"kubernetes.io/projected/ab5d00f4-8395-4b1b-af14-687bbb9071c5-kube-api-access-hrdjf\") pod \"cinder-db-create-p8xwm\" (UID: \"ab5d00f4-8395-4b1b-af14-687bbb9071c5\") " pod="openstack/cinder-db-create-p8xwm" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.474596 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f5sff"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.475877 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5sff" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.481620 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f5sff"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.539393 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7dz7\" (UniqueName: \"kubernetes.io/projected/7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2-kube-api-access-p7dz7\") pod \"barbican-db-create-f5sff\" (UID: \"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2\") " pod="openstack/barbican-db-create-f5sff" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.539520 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdjf\" (UniqueName: \"kubernetes.io/projected/ab5d00f4-8395-4b1b-af14-687bbb9071c5-kube-api-access-hrdjf\") pod \"cinder-db-create-p8xwm\" (UID: \"ab5d00f4-8395-4b1b-af14-687bbb9071c5\") " pod="openstack/cinder-db-create-p8xwm" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.563901 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdjf\" (UniqueName: \"kubernetes.io/projected/ab5d00f4-8395-4b1b-af14-687bbb9071c5-kube-api-access-hrdjf\") pod \"cinder-db-create-p8xwm\" (UID: \"ab5d00f4-8395-4b1b-af14-687bbb9071c5\") " pod="openstack/cinder-db-create-p8xwm" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.583068 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rnvqv"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.584186 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.585936 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.587205 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.587367 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.587482 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w9rfd" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.593290 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rnvqv"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.633616 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hrjbr"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.635306 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hrjbr" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.638840 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hrjbr"] Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.641061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7dz7\" (UniqueName: \"kubernetes.io/projected/7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2-kube-api-access-p7dz7\") pod \"barbican-db-create-f5sff\" (UID: \"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2\") " pod="openstack/barbican-db-create-f5sff" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.674610 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p8xwm" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.676458 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7dz7\" (UniqueName: \"kubernetes.io/projected/7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2-kube-api-access-p7dz7\") pod \"barbican-db-create-f5sff\" (UID: \"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2\") " pod="openstack/barbican-db-create-f5sff" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.742783 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-combined-ca-bundle\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.742887 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64bwf\" (UniqueName: \"kubernetes.io/projected/ac31bb14-31ca-44cc-9e94-6af59e03a578-kube-api-access-64bwf\") pod \"neutron-db-create-hrjbr\" (UID: \"ac31bb14-31ca-44cc-9e94-6af59e03a578\") " pod="openstack/neutron-db-create-hrjbr" 
Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.742944 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk6qd\" (UniqueName: \"kubernetes.io/projected/c38775cb-8a0d-4834-8215-af35a9fd4952-kube-api-access-bk6qd\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.742978 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-config-data\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.799283 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5sff" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.814701 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cw4gf" event={"ID":"e981b526-0afb-4a9c-ba89-fe87728f4603","Type":"ContainerStarted","Data":"d13825f1b815c92638c69f6725aa09f0f1b96cee691e74b679418aece4ca46f8"} Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.814744 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv-config-7vc5g" event={"ID":"ee871f13-1e33-47c7-aad9-849657a07bbb","Type":"ContainerStarted","Data":"397f277ef97bc2edcb05128de9b277667c3730f74dce3268c7e03df1a242382a"} Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.814763 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv-config-7vc5g" event={"ID":"ee871f13-1e33-47c7-aad9-849657a07bbb","Type":"ContainerStarted","Data":"84cd778a680a4eed01cbff5d1f1a9ff1097b8409913b7f355f2ae5b28b022fb4"} Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.831474 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mqtlv-config-7vc5g" podStartSLOduration=1.831458346 podStartE2EDuration="1.831458346s" podCreationTimestamp="2025-10-07 19:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:17:47.830692581 +0000 UTC m=+1056.652731218" watchObservedRunningTime="2025-10-07 19:17:47.831458346 +0000 UTC m=+1056.653496973" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.844400 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-combined-ca-bundle\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.844492 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64bwf\" (UniqueName: \"kubernetes.io/projected/ac31bb14-31ca-44cc-9e94-6af59e03a578-kube-api-access-64bwf\") pod \"neutron-db-create-hrjbr\" (UID: \"ac31bb14-31ca-44cc-9e94-6af59e03a578\") " pod="openstack/neutron-db-create-hrjbr" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.844542 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk6qd\" (UniqueName: \"kubernetes.io/projected/c38775cb-8a0d-4834-8215-af35a9fd4952-kube-api-access-bk6qd\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.844585 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-config-data\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") 
" pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.850093 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-combined-ca-bundle\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.854963 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-config-data\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.865879 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk6qd\" (UniqueName: \"kubernetes.io/projected/c38775cb-8a0d-4834-8215-af35a9fd4952-kube-api-access-bk6qd\") pod \"keystone-db-sync-rnvqv\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.868456 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64bwf\" (UniqueName: \"kubernetes.io/projected/ac31bb14-31ca-44cc-9e94-6af59e03a578-kube-api-access-64bwf\") pod \"neutron-db-create-hrjbr\" (UID: \"ac31bb14-31ca-44cc-9e94-6af59e03a578\") " pod="openstack/neutron-db-create-hrjbr" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.930250 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:17:47 crc kubenswrapper[4825]: I1007 19:17:47.953610 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hrjbr" Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.130969 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p8xwm"] Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.329511 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f5sff"] Oct 07 19:17:48 crc kubenswrapper[4825]: W1007 19:17:48.335926 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e2e5b54_c75b_4d58_aff1_ea98ac2f6dd2.slice/crio-1e211c81f69af3dbe21d98da32ec9788cba30c311dbacbd40326200f2db02d3d WatchSource:0}: Error finding container 1e211c81f69af3dbe21d98da32ec9788cba30c311dbacbd40326200f2db02d3d: Status 404 returned error can't find the container with id 1e211c81f69af3dbe21d98da32ec9788cba30c311dbacbd40326200f2db02d3d Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.541731 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rnvqv"] Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.555418 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hrjbr"] Oct 07 19:17:48 crc kubenswrapper[4825]: W1007 19:17:48.559914 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc38775cb_8a0d_4834_8215_af35a9fd4952.slice/crio-10e01eccdfe5a3457e823265571c43b0c55f6078e516cc140da060b55c868313 WatchSource:0}: Error finding container 10e01eccdfe5a3457e823265571c43b0c55f6078e516cc140da060b55c868313: Status 404 returned error can't find the container with id 10e01eccdfe5a3457e823265571c43b0c55f6078e516cc140da060b55c868313 Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.825598 4825 generic.go:334] "Generic (PLEG): container finished" podID="ab5d00f4-8395-4b1b-af14-687bbb9071c5" 
containerID="4186850b8f8a80925779fceac05d91bf622138245e945fddc1a9f23e64e45757" exitCode=0 Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.825664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p8xwm" event={"ID":"ab5d00f4-8395-4b1b-af14-687bbb9071c5","Type":"ContainerDied","Data":"4186850b8f8a80925779fceac05d91bf622138245e945fddc1a9f23e64e45757"} Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.825862 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p8xwm" event={"ID":"ab5d00f4-8395-4b1b-af14-687bbb9071c5","Type":"ContainerStarted","Data":"767d13c4e96753cb0132d818a62599454b7e2431fb63185bb362aa067cd540e3"} Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.830796 4825 generic.go:334] "Generic (PLEG): container finished" podID="ee871f13-1e33-47c7-aad9-849657a07bbb" containerID="397f277ef97bc2edcb05128de9b277667c3730f74dce3268c7e03df1a242382a" exitCode=0 Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.831106 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv-config-7vc5g" event={"ID":"ee871f13-1e33-47c7-aad9-849657a07bbb","Type":"ContainerDied","Data":"397f277ef97bc2edcb05128de9b277667c3730f74dce3268c7e03df1a242382a"} Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.833535 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rnvqv" event={"ID":"c38775cb-8a0d-4834-8215-af35a9fd4952","Type":"ContainerStarted","Data":"10e01eccdfe5a3457e823265571c43b0c55f6078e516cc140da060b55c868313"} Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.844850 4825 generic.go:334] "Generic (PLEG): container finished" podID="7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2" containerID="9ec338039e8e81ace96a3c1e412c1c5618237af373575f561bc1f48885dcfd88" exitCode=0 Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.844931 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-f5sff" event={"ID":"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2","Type":"ContainerDied","Data":"9ec338039e8e81ace96a3c1e412c1c5618237af373575f561bc1f48885dcfd88"} Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.844957 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5sff" event={"ID":"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2","Type":"ContainerStarted","Data":"1e211c81f69af3dbe21d98da32ec9788cba30c311dbacbd40326200f2db02d3d"} Oct 07 19:17:48 crc kubenswrapper[4825]: I1007 19:17:48.846467 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hrjbr" event={"ID":"ac31bb14-31ca-44cc-9e94-6af59e03a578","Type":"ContainerStarted","Data":"db624b877ed6d5db20cf7a39f12cb7cd43174e4cc587d31fefb59e7a84de6611"} Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.311219 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.370980 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-9vwm4"] Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.388886 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-9vwm4" podUID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerName="dnsmasq-dns" containerID="cri-o://fbf4f01494b54defbea82af8c554f6165fc11ad711f24a704b0e02d1c1ebec19" gracePeriod=10 Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.869360 4825 generic.go:334] "Generic (PLEG): container finished" podID="ac31bb14-31ca-44cc-9e94-6af59e03a578" containerID="614bb105602c607a6a4de778d21c754de7d7b420a89dec2b5999fc9ad9338df3" exitCode=0 Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.869507 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hrjbr" 
event={"ID":"ac31bb14-31ca-44cc-9e94-6af59e03a578","Type":"ContainerDied","Data":"614bb105602c607a6a4de778d21c754de7d7b420a89dec2b5999fc9ad9338df3"} Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.872589 4825 generic.go:334] "Generic (PLEG): container finished" podID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerID="fbf4f01494b54defbea82af8c554f6165fc11ad711f24a704b0e02d1c1ebec19" exitCode=0 Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.872751 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-9vwm4" event={"ID":"41179ae7-4ff4-4c39-81d2-8867a63917e6","Type":"ContainerDied","Data":"fbf4f01494b54defbea82af8c554f6165fc11ad711f24a704b0e02d1c1ebec19"} Oct 07 19:17:49 crc kubenswrapper[4825]: I1007 19:17:49.945769 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.134353 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-nb\") pod \"41179ae7-4ff4-4c39-81d2-8867a63917e6\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.134425 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-config\") pod \"41179ae7-4ff4-4c39-81d2-8867a63917e6\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.134484 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-dns-svc\") pod \"41179ae7-4ff4-4c39-81d2-8867a63917e6\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 
19:17:50.134565 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-sb\") pod \"41179ae7-4ff4-4c39-81d2-8867a63917e6\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.134598 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmfk\" (UniqueName: \"kubernetes.io/projected/41179ae7-4ff4-4c39-81d2-8867a63917e6-kube-api-access-gxmfk\") pod \"41179ae7-4ff4-4c39-81d2-8867a63917e6\" (UID: \"41179ae7-4ff4-4c39-81d2-8867a63917e6\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.161012 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41179ae7-4ff4-4c39-81d2-8867a63917e6-kube-api-access-gxmfk" (OuterVolumeSpecName: "kube-api-access-gxmfk") pod "41179ae7-4ff4-4c39-81d2-8867a63917e6" (UID: "41179ae7-4ff4-4c39-81d2-8867a63917e6"). InnerVolumeSpecName "kube-api-access-gxmfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.183575 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-config" (OuterVolumeSpecName: "config") pod "41179ae7-4ff4-4c39-81d2-8867a63917e6" (UID: "41179ae7-4ff4-4c39-81d2-8867a63917e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.200298 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41179ae7-4ff4-4c39-81d2-8867a63917e6" (UID: "41179ae7-4ff4-4c39-81d2-8867a63917e6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.209292 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41179ae7-4ff4-4c39-81d2-8867a63917e6" (UID: "41179ae7-4ff4-4c39-81d2-8867a63917e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.222426 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41179ae7-4ff4-4c39-81d2-8867a63917e6" (UID: "41179ae7-4ff4-4c39-81d2-8867a63917e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.235625 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.235654 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.235664 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.235675 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmfk\" (UniqueName: \"kubernetes.io/projected/41179ae7-4ff4-4c39-81d2-8867a63917e6-kube-api-access-gxmfk\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc 
kubenswrapper[4825]: I1007 19:17:50.235684 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41179ae7-4ff4-4c39-81d2-8867a63917e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.261478 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5sff" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.280613 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.336649 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-scripts\") pod \"ee871f13-1e33-47c7-aad9-849657a07bbb\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.336720 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run\") pod \"ee871f13-1e33-47c7-aad9-849657a07bbb\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.336763 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7dz7\" (UniqueName: \"kubernetes.io/projected/7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2-kube-api-access-p7dz7\") pod \"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2\" (UID: \"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.336815 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run-ovn\") pod \"ee871f13-1e33-47c7-aad9-849657a07bbb\" (UID: 
\"ee871f13-1e33-47c7-aad9-849657a07bbb\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.336860 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-additional-scripts\") pod \"ee871f13-1e33-47c7-aad9-849657a07bbb\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.336890 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdqgm\" (UniqueName: \"kubernetes.io/projected/ee871f13-1e33-47c7-aad9-849657a07bbb-kube-api-access-pdqgm\") pod \"ee871f13-1e33-47c7-aad9-849657a07bbb\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.336939 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-log-ovn\") pod \"ee871f13-1e33-47c7-aad9-849657a07bbb\" (UID: \"ee871f13-1e33-47c7-aad9-849657a07bbb\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.337259 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ee871f13-1e33-47c7-aad9-849657a07bbb" (UID: "ee871f13-1e33-47c7-aad9-849657a07bbb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.337331 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ee871f13-1e33-47c7-aad9-849657a07bbb" (UID: "ee871f13-1e33-47c7-aad9-849657a07bbb"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.337467 4825 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.337481 4825 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.337500 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run" (OuterVolumeSpecName: "var-run") pod "ee871f13-1e33-47c7-aad9-849657a07bbb" (UID: "ee871f13-1e33-47c7-aad9-849657a07bbb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.337943 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ee871f13-1e33-47c7-aad9-849657a07bbb" (UID: "ee871f13-1e33-47c7-aad9-849657a07bbb"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.338122 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-scripts" (OuterVolumeSpecName: "scripts") pod "ee871f13-1e33-47c7-aad9-849657a07bbb" (UID: "ee871f13-1e33-47c7-aad9-849657a07bbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.340276 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-p8xwm" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.341446 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2-kube-api-access-p7dz7" (OuterVolumeSpecName: "kube-api-access-p7dz7") pod "7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2" (UID: "7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2"). InnerVolumeSpecName "kube-api-access-p7dz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.342070 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee871f13-1e33-47c7-aad9-849657a07bbb-kube-api-access-pdqgm" (OuterVolumeSpecName: "kube-api-access-pdqgm") pod "ee871f13-1e33-47c7-aad9-849657a07bbb" (UID: "ee871f13-1e33-47c7-aad9-849657a07bbb"). InnerVolumeSpecName "kube-api-access-pdqgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.438179 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrdjf\" (UniqueName: \"kubernetes.io/projected/ab5d00f4-8395-4b1b-af14-687bbb9071c5-kube-api-access-hrdjf\") pod \"ab5d00f4-8395-4b1b-af14-687bbb9071c5\" (UID: \"ab5d00f4-8395-4b1b-af14-687bbb9071c5\") " Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.438487 4825 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.438503 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdqgm\" (UniqueName: \"kubernetes.io/projected/ee871f13-1e33-47c7-aad9-849657a07bbb-kube-api-access-pdqgm\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.438515 4825 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee871f13-1e33-47c7-aad9-849657a07bbb-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.438526 4825 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee871f13-1e33-47c7-aad9-849657a07bbb-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.438535 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7dz7\" (UniqueName: \"kubernetes.io/projected/7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2-kube-api-access-p7dz7\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.441683 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5d00f4-8395-4b1b-af14-687bbb9071c5-kube-api-access-hrdjf" (OuterVolumeSpecName: "kube-api-access-hrdjf") pod "ab5d00f4-8395-4b1b-af14-687bbb9071c5" (UID: "ab5d00f4-8395-4b1b-af14-687bbb9071c5"). InnerVolumeSpecName "kube-api-access-hrdjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.539917 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrdjf\" (UniqueName: \"kubernetes.io/projected/ab5d00f4-8395-4b1b-af14-687bbb9071c5-kube-api-access-hrdjf\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.885997 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-9vwm4" event={"ID":"41179ae7-4ff4-4c39-81d2-8867a63917e6","Type":"ContainerDied","Data":"13cff80defa57c1f5a12ec452bdf650ee332ab4e77c95b41779a2d0911e6b085"} Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.886022 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-9vwm4" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.886046 4825 scope.go:117] "RemoveContainer" containerID="fbf4f01494b54defbea82af8c554f6165fc11ad711f24a704b0e02d1c1ebec19" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.890501 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5sff" event={"ID":"7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2","Type":"ContainerDied","Data":"1e211c81f69af3dbe21d98da32ec9788cba30c311dbacbd40326200f2db02d3d"} Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.890535 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e211c81f69af3dbe21d98da32ec9788cba30c311dbacbd40326200f2db02d3d" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.890582 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5sff" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.895310 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p8xwm" event={"ID":"ab5d00f4-8395-4b1b-af14-687bbb9071c5","Type":"ContainerDied","Data":"767d13c4e96753cb0132d818a62599454b7e2431fb63185bb362aa067cd540e3"} Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.895348 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767d13c4e96753cb0132d818a62599454b7e2431fb63185bb362aa067cd540e3" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.895402 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-p8xwm" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.903320 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv-config-7vc5g" event={"ID":"ee871f13-1e33-47c7-aad9-849657a07bbb","Type":"ContainerDied","Data":"84cd778a680a4eed01cbff5d1f1a9ff1097b8409913b7f355f2ae5b28b022fb4"} Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.903342 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-7vc5g" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.903360 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84cd778a680a4eed01cbff5d1f1a9ff1097b8409913b7f355f2ae5b28b022fb4" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.910435 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mqtlv-config-7vc5g"] Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.924702 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mqtlv-config-7vc5g"] Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.943188 4825 scope.go:117] "RemoveContainer" containerID="b6d0ee339a8dd3f05512b7bd310db99d28a1a32a79e11ab560300e53815b4c7e" Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.949355 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-9vwm4"] Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.955666 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-9vwm4"] Oct 07 19:17:50 crc kubenswrapper[4825]: I1007 19:17:50.955809 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mqtlv" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.011746 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mqtlv-config-79xqb"] Oct 07 19:17:51 crc 
kubenswrapper[4825]: E1007 19:17:51.013085 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee871f13-1e33-47c7-aad9-849657a07bbb" containerName="ovn-config" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013100 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee871f13-1e33-47c7-aad9-849657a07bbb" containerName="ovn-config" Oct 07 19:17:51 crc kubenswrapper[4825]: E1007 19:17:51.013112 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerName="init" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013118 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerName="init" Oct 07 19:17:51 crc kubenswrapper[4825]: E1007 19:17:51.013127 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerName="dnsmasq-dns" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013133 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerName="dnsmasq-dns" Oct 07 19:17:51 crc kubenswrapper[4825]: E1007 19:17:51.013154 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2" containerName="mariadb-database-create" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013160 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2" containerName="mariadb-database-create" Oct 07 19:17:51 crc kubenswrapper[4825]: E1007 19:17:51.013173 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5d00f4-8395-4b1b-af14-687bbb9071c5" containerName="mariadb-database-create" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013178 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5d00f4-8395-4b1b-af14-687bbb9071c5" containerName="mariadb-database-create" Oct 07 19:17:51 crc kubenswrapper[4825]: 
I1007 19:17:51.013337 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2" containerName="mariadb-database-create" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013355 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="41179ae7-4ff4-4c39-81d2-8867a63917e6" containerName="dnsmasq-dns" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013370 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee871f13-1e33-47c7-aad9-849657a07bbb" containerName="ovn-config" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013376 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5d00f4-8395-4b1b-af14-687bbb9071c5" containerName="mariadb-database-create" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.013863 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.019416 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.045640 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mqtlv-config-79xqb"] Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.150456 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-scripts\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.150517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zz6d\" (UniqueName: 
\"kubernetes.io/projected/34059d97-cef2-4f86-bfa4-cc719f9bb712-kube-api-access-9zz6d\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.150638 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run-ovn\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.150685 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-log-ovn\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.150726 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-additional-scripts\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.150749 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252590 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run-ovn\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252671 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-log-ovn\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252714 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-additional-scripts\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252740 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252790 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-scripts\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252816 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9zz6d\" (UniqueName: \"kubernetes.io/projected/34059d97-cef2-4f86-bfa4-cc719f9bb712-kube-api-access-9zz6d\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252819 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run-ovn\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252878 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-log-ovn\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.252887 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.253643 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-additional-scripts\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.255260 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-scripts\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.271127 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zz6d\" (UniqueName: \"kubernetes.io/projected/34059d97-cef2-4f86-bfa4-cc719f9bb712-kube-api-access-9zz6d\") pod \"ovn-controller-mqtlv-config-79xqb\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.321335 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hrjbr" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.342100 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.457172 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64bwf\" (UniqueName: \"kubernetes.io/projected/ac31bb14-31ca-44cc-9e94-6af59e03a578-kube-api-access-64bwf\") pod \"ac31bb14-31ca-44cc-9e94-6af59e03a578\" (UID: \"ac31bb14-31ca-44cc-9e94-6af59e03a578\") " Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.462328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac31bb14-31ca-44cc-9e94-6af59e03a578-kube-api-access-64bwf" (OuterVolumeSpecName: "kube-api-access-64bwf") pod "ac31bb14-31ca-44cc-9e94-6af59e03a578" (UID: "ac31bb14-31ca-44cc-9e94-6af59e03a578"). InnerVolumeSpecName "kube-api-access-64bwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.566129 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64bwf\" (UniqueName: \"kubernetes.io/projected/ac31bb14-31ca-44cc-9e94-6af59e03a578-kube-api-access-64bwf\") on node \"crc\" DevicePath \"\"" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.787863 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mqtlv-config-79xqb"] Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.823846 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41179ae7-4ff4-4c39-81d2-8867a63917e6" path="/var/lib/kubelet/pods/41179ae7-4ff4-4c39-81d2-8867a63917e6/volumes" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.827973 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee871f13-1e33-47c7-aad9-849657a07bbb" path="/var/lib/kubelet/pods/ee871f13-1e33-47c7-aad9-849657a07bbb/volumes" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.914927 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hrjbr" event={"ID":"ac31bb14-31ca-44cc-9e94-6af59e03a578","Type":"ContainerDied","Data":"db624b877ed6d5db20cf7a39f12cb7cd43174e4cc587d31fefb59e7a84de6611"} Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.914974 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db624b877ed6d5db20cf7a39f12cb7cd43174e4cc587d31fefb59e7a84de6611" Oct 07 19:17:51 crc kubenswrapper[4825]: I1007 19:17:51.915053 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hrjbr" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.463019 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4c09-account-create-bgzj4"] Oct 07 19:17:57 crc kubenswrapper[4825]: E1007 19:17:57.464155 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac31bb14-31ca-44cc-9e94-6af59e03a578" containerName="mariadb-database-create" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.464176 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac31bb14-31ca-44cc-9e94-6af59e03a578" containerName="mariadb-database-create" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.464494 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac31bb14-31ca-44cc-9e94-6af59e03a578" containerName="mariadb-database-create" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.465302 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4c09-account-create-bgzj4" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.467769 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.468936 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4c09-account-create-bgzj4"] Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.491787 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckbm\" (UniqueName: \"kubernetes.io/projected/fed9dc70-b8c0-434e-9ee3-68c176c29362-kube-api-access-rckbm\") pod \"barbican-4c09-account-create-bgzj4\" (UID: \"fed9dc70-b8c0-434e-9ee3-68c176c29362\") " pod="openstack/barbican-4c09-account-create-bgzj4" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.550633 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3398-account-create-xz4rk"] Oct 07 19:17:57 crc 
kubenswrapper[4825]: I1007 19:17:57.555784 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3398-account-create-xz4rk" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.557570 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.561085 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3398-account-create-xz4rk"] Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.593055 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rckbm\" (UniqueName: \"kubernetes.io/projected/fed9dc70-b8c0-434e-9ee3-68c176c29362-kube-api-access-rckbm\") pod \"barbican-4c09-account-create-bgzj4\" (UID: \"fed9dc70-b8c0-434e-9ee3-68c176c29362\") " pod="openstack/barbican-4c09-account-create-bgzj4" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.593241 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp24p\" (UniqueName: \"kubernetes.io/projected/13f685da-b7bf-4718-b45c-7c19a681de56-kube-api-access-tp24p\") pod \"cinder-3398-account-create-xz4rk\" (UID: \"13f685da-b7bf-4718-b45c-7c19a681de56\") " pod="openstack/cinder-3398-account-create-xz4rk" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.611425 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckbm\" (UniqueName: \"kubernetes.io/projected/fed9dc70-b8c0-434e-9ee3-68c176c29362-kube-api-access-rckbm\") pod \"barbican-4c09-account-create-bgzj4\" (UID: \"fed9dc70-b8c0-434e-9ee3-68c176c29362\") " pod="openstack/barbican-4c09-account-create-bgzj4" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.694001 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp24p\" (UniqueName: 
\"kubernetes.io/projected/13f685da-b7bf-4718-b45c-7c19a681de56-kube-api-access-tp24p\") pod \"cinder-3398-account-create-xz4rk\" (UID: \"13f685da-b7bf-4718-b45c-7c19a681de56\") " pod="openstack/cinder-3398-account-create-xz4rk" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.710640 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp24p\" (UniqueName: \"kubernetes.io/projected/13f685da-b7bf-4718-b45c-7c19a681de56-kube-api-access-tp24p\") pod \"cinder-3398-account-create-xz4rk\" (UID: \"13f685da-b7bf-4718-b45c-7c19a681de56\") " pod="openstack/cinder-3398-account-create-xz4rk" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.754013 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-327b-account-create-rmxxh"] Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.755954 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-327b-account-create-rmxxh" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.758833 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.767570 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-327b-account-create-rmxxh"] Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.787933 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4c09-account-create-bgzj4" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.795447 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhcc\" (UniqueName: \"kubernetes.io/projected/9661376c-9487-4ac1-af61-e4b4f846f554-kube-api-access-zqhcc\") pod \"neutron-327b-account-create-rmxxh\" (UID: \"9661376c-9487-4ac1-af61-e4b4f846f554\") " pod="openstack/neutron-327b-account-create-rmxxh" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.874832 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3398-account-create-xz4rk" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.898950 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhcc\" (UniqueName: \"kubernetes.io/projected/9661376c-9487-4ac1-af61-e4b4f846f554-kube-api-access-zqhcc\") pod \"neutron-327b-account-create-rmxxh\" (UID: \"9661376c-9487-4ac1-af61-e4b4f846f554\") " pod="openstack/neutron-327b-account-create-rmxxh" Oct 07 19:17:57 crc kubenswrapper[4825]: I1007 19:17:57.920254 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhcc\" (UniqueName: \"kubernetes.io/projected/9661376c-9487-4ac1-af61-e4b4f846f554-kube-api-access-zqhcc\") pod \"neutron-327b-account-create-rmxxh\" (UID: \"9661376c-9487-4ac1-af61-e4b4f846f554\") " pod="openstack/neutron-327b-account-create-rmxxh" Oct 07 19:17:58 crc kubenswrapper[4825]: I1007 19:17:58.080654 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-327b-account-create-rmxxh" Oct 07 19:18:00 crc kubenswrapper[4825]: W1007 19:18:00.972981 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34059d97_cef2_4f86_bfa4_cc719f9bb712.slice/crio-47b6febc4f5364a1a8f0cd84a25b7ef77a068302c4cb9bc6ee693077111bc6e2 WatchSource:0}: Error finding container 47b6febc4f5364a1a8f0cd84a25b7ef77a068302c4cb9bc6ee693077111bc6e2: Status 404 returned error can't find the container with id 47b6febc4f5364a1a8f0cd84a25b7ef77a068302c4cb9bc6ee693077111bc6e2 Oct 07 19:18:01 crc kubenswrapper[4825]: I1007 19:18:01.004238 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv-config-79xqb" event={"ID":"34059d97-cef2-4f86-bfa4-cc719f9bb712","Type":"ContainerStarted","Data":"47b6febc4f5364a1a8f0cd84a25b7ef77a068302c4cb9bc6ee693077111bc6e2"} Oct 07 19:18:01 crc kubenswrapper[4825]: I1007 19:18:01.558777 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-327b-account-create-rmxxh"] Oct 07 19:18:01 crc kubenswrapper[4825]: I1007 19:18:01.571168 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3398-account-create-xz4rk"] Oct 07 19:18:01 crc kubenswrapper[4825]: I1007 19:18:01.644911 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4c09-account-create-bgzj4"] Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.012715 4825 generic.go:334] "Generic (PLEG): container finished" podID="13f685da-b7bf-4718-b45c-7c19a681de56" containerID="ca0169795172599287bb6168937113feecdb5f27e5899e43d0cab81f82c5f322" exitCode=0 Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.012776 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3398-account-create-xz4rk" 
event={"ID":"13f685da-b7bf-4718-b45c-7c19a681de56","Type":"ContainerDied","Data":"ca0169795172599287bb6168937113feecdb5f27e5899e43d0cab81f82c5f322"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.012801 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3398-account-create-xz4rk" event={"ID":"13f685da-b7bf-4718-b45c-7c19a681de56","Type":"ContainerStarted","Data":"bdc7e0363d7b27e0c47c3e23d21379696652eb57f52ac4c5cd4eedcac2ccfedf"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.015145 4825 generic.go:334] "Generic (PLEG): container finished" podID="9661376c-9487-4ac1-af61-e4b4f846f554" containerID="d3c07cff373a2370fbfc0ceac515737fda186b40a16e7c6e0a6be4db38f31f2c" exitCode=0 Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.015184 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-327b-account-create-rmxxh" event={"ID":"9661376c-9487-4ac1-af61-e4b4f846f554","Type":"ContainerDied","Data":"d3c07cff373a2370fbfc0ceac515737fda186b40a16e7c6e0a6be4db38f31f2c"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.015199 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-327b-account-create-rmxxh" event={"ID":"9661376c-9487-4ac1-af61-e4b4f846f554","Type":"ContainerStarted","Data":"b2d3bd5df0aee0d28396bdda270f2919aa55c725fa3b17c58fc58cf2be940ca2"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.018465 4825 generic.go:334] "Generic (PLEG): container finished" podID="34059d97-cef2-4f86-bfa4-cc719f9bb712" containerID="e99899481fd0f3aabd7404eeb522dae7f72fe8e8eec6445f5a136a9621e0ccca" exitCode=0 Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.018508 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv-config-79xqb" event={"ID":"34059d97-cef2-4f86-bfa4-cc719f9bb712","Type":"ContainerDied","Data":"e99899481fd0f3aabd7404eeb522dae7f72fe8e8eec6445f5a136a9621e0ccca"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.026987 
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cw4gf" event={"ID":"e981b526-0afb-4a9c-ba89-fe87728f4603","Type":"ContainerStarted","Data":"2db6a6ec54cde59acb7506828b34a2ca3ab8872828ab4031ac5dfafa166331c6"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.055503 4825 generic.go:334] "Generic (PLEG): container finished" podID="fed9dc70-b8c0-434e-9ee3-68c176c29362" containerID="b3e3f43b9a08835f74a4535947ef7941a4bf08570c4c15e8f2014da54c998f04" exitCode=0 Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.055678 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4c09-account-create-bgzj4" event={"ID":"fed9dc70-b8c0-434e-9ee3-68c176c29362","Type":"ContainerDied","Data":"b3e3f43b9a08835f74a4535947ef7941a4bf08570c4c15e8f2014da54c998f04"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.055705 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4c09-account-create-bgzj4" event={"ID":"fed9dc70-b8c0-434e-9ee3-68c176c29362","Type":"ContainerStarted","Data":"9631a12ba08bda9b5bf955ddf3a1fb9e62c49ee013e293d2326c506ef13e2e51"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.059706 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rnvqv" event={"ID":"c38775cb-8a0d-4834-8215-af35a9fd4952","Type":"ContainerStarted","Data":"b4a7f889fdf763b8db72d95639951da308ba33e2bdd9e991b334942643daadb4"} Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.080833 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-cw4gf" podStartSLOduration=1.873885934 podStartE2EDuration="16.080774801s" podCreationTimestamp="2025-10-07 19:17:46 +0000 UTC" firstStartedPulling="2025-10-07 19:17:46.967802839 +0000 UTC m=+1055.789841486" lastFinishedPulling="2025-10-07 19:18:01.174691716 +0000 UTC m=+1069.996730353" observedRunningTime="2025-10-07 19:18:02.074058245 +0000 UTC m=+1070.896096902" 
watchObservedRunningTime="2025-10-07 19:18:02.080774801 +0000 UTC m=+1070.902813438" Oct 07 19:18:02 crc kubenswrapper[4825]: I1007 19:18:02.112441 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rnvqv" podStartSLOduration=2.615809245 podStartE2EDuration="15.112389843s" podCreationTimestamp="2025-10-07 19:17:47 +0000 UTC" firstStartedPulling="2025-10-07 19:17:48.589885264 +0000 UTC m=+1057.411923901" lastFinishedPulling="2025-10-07 19:18:01.086465852 +0000 UTC m=+1069.908504499" observedRunningTime="2025-10-07 19:18:02.107806926 +0000 UTC m=+1070.929845563" watchObservedRunningTime="2025-10-07 19:18:02.112389843 +0000 UTC m=+1070.934428480" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.619785 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.630873 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-327b-account-create-rmxxh" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.639284 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4c09-account-create-bgzj4" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.647930 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3398-account-create-xz4rk" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703502 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqhcc\" (UniqueName: \"kubernetes.io/projected/9661376c-9487-4ac1-af61-e4b4f846f554-kube-api-access-zqhcc\") pod \"9661376c-9487-4ac1-af61-e4b4f846f554\" (UID: \"9661376c-9487-4ac1-af61-e4b4f846f554\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703580 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run\") pod \"34059d97-cef2-4f86-bfa4-cc719f9bb712\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703672 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-log-ovn\") pod \"34059d97-cef2-4f86-bfa4-cc719f9bb712\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703695 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-additional-scripts\") pod \"34059d97-cef2-4f86-bfa4-cc719f9bb712\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703715 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rckbm\" (UniqueName: \"kubernetes.io/projected/fed9dc70-b8c0-434e-9ee3-68c176c29362-kube-api-access-rckbm\") pod \"fed9dc70-b8c0-434e-9ee3-68c176c29362\" (UID: \"fed9dc70-b8c0-434e-9ee3-68c176c29362\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703734 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-scripts\") pod \"34059d97-cef2-4f86-bfa4-cc719f9bb712\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703765 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run-ovn\") pod \"34059d97-cef2-4f86-bfa4-cc719f9bb712\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703800 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp24p\" (UniqueName: \"kubernetes.io/projected/13f685da-b7bf-4718-b45c-7c19a681de56-kube-api-access-tp24p\") pod \"13f685da-b7bf-4718-b45c-7c19a681de56\" (UID: \"13f685da-b7bf-4718-b45c-7c19a681de56\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703715 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run" (OuterVolumeSpecName: "var-run") pod "34059d97-cef2-4f86-bfa4-cc719f9bb712" (UID: "34059d97-cef2-4f86-bfa4-cc719f9bb712"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703840 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "34059d97-cef2-4f86-bfa4-cc719f9bb712" (UID: "34059d97-cef2-4f86-bfa4-cc719f9bb712"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.703859 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "34059d97-cef2-4f86-bfa4-cc719f9bb712" (UID: "34059d97-cef2-4f86-bfa4-cc719f9bb712"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.704637 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "34059d97-cef2-4f86-bfa4-cc719f9bb712" (UID: "34059d97-cef2-4f86-bfa4-cc719f9bb712"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.706521 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-scripts" (OuterVolumeSpecName: "scripts") pod "34059d97-cef2-4f86-bfa4-cc719f9bb712" (UID: "34059d97-cef2-4f86-bfa4-cc719f9bb712"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.706651 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zz6d\" (UniqueName: \"kubernetes.io/projected/34059d97-cef2-4f86-bfa4-cc719f9bb712-kube-api-access-9zz6d\") pod \"34059d97-cef2-4f86-bfa4-cc719f9bb712\" (UID: \"34059d97-cef2-4f86-bfa4-cc719f9bb712\") " Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.707402 4825 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.707421 4825 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.707431 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34059d97-cef2-4f86-bfa4-cc719f9bb712-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.707441 4825 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.707450 4825 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34059d97-cef2-4f86-bfa4-cc719f9bb712-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.710050 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9661376c-9487-4ac1-af61-e4b4f846f554-kube-api-access-zqhcc" (OuterVolumeSpecName: 
"kube-api-access-zqhcc") pod "9661376c-9487-4ac1-af61-e4b4f846f554" (UID: "9661376c-9487-4ac1-af61-e4b4f846f554"). InnerVolumeSpecName "kube-api-access-zqhcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.710629 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed9dc70-b8c0-434e-9ee3-68c176c29362-kube-api-access-rckbm" (OuterVolumeSpecName: "kube-api-access-rckbm") pod "fed9dc70-b8c0-434e-9ee3-68c176c29362" (UID: "fed9dc70-b8c0-434e-9ee3-68c176c29362"). InnerVolumeSpecName "kube-api-access-rckbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.711487 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f685da-b7bf-4718-b45c-7c19a681de56-kube-api-access-tp24p" (OuterVolumeSpecName: "kube-api-access-tp24p") pod "13f685da-b7bf-4718-b45c-7c19a681de56" (UID: "13f685da-b7bf-4718-b45c-7c19a681de56"). InnerVolumeSpecName "kube-api-access-tp24p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.714410 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34059d97-cef2-4f86-bfa4-cc719f9bb712-kube-api-access-9zz6d" (OuterVolumeSpecName: "kube-api-access-9zz6d") pod "34059d97-cef2-4f86-bfa4-cc719f9bb712" (UID: "34059d97-cef2-4f86-bfa4-cc719f9bb712"). InnerVolumeSpecName "kube-api-access-9zz6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.809213 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp24p\" (UniqueName: \"kubernetes.io/projected/13f685da-b7bf-4718-b45c-7c19a681de56-kube-api-access-tp24p\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.809335 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zz6d\" (UniqueName: \"kubernetes.io/projected/34059d97-cef2-4f86-bfa4-cc719f9bb712-kube-api-access-9zz6d\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.809358 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhcc\" (UniqueName: \"kubernetes.io/projected/9661376c-9487-4ac1-af61-e4b4f846f554-kube-api-access-zqhcc\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:03 crc kubenswrapper[4825]: I1007 19:18:03.809377 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rckbm\" (UniqueName: \"kubernetes.io/projected/fed9dc70-b8c0-434e-9ee3-68c176c29362-kube-api-access-rckbm\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.083799 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4c09-account-create-bgzj4" event={"ID":"fed9dc70-b8c0-434e-9ee3-68c176c29362","Type":"ContainerDied","Data":"9631a12ba08bda9b5bf955ddf3a1fb9e62c49ee013e293d2326c506ef13e2e51"} Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.083851 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9631a12ba08bda9b5bf955ddf3a1fb9e62c49ee013e293d2326c506ef13e2e51" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.083810 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4c09-account-create-bgzj4" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.086072 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3398-account-create-xz4rk" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.086082 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3398-account-create-xz4rk" event={"ID":"13f685da-b7bf-4718-b45c-7c19a681de56","Type":"ContainerDied","Data":"bdc7e0363d7b27e0c47c3e23d21379696652eb57f52ac4c5cd4eedcac2ccfedf"} Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.086174 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc7e0363d7b27e0c47c3e23d21379696652eb57f52ac4c5cd4eedcac2ccfedf" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.087792 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-327b-account-create-rmxxh" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.087802 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-327b-account-create-rmxxh" event={"ID":"9661376c-9487-4ac1-af61-e4b4f846f554","Type":"ContainerDied","Data":"b2d3bd5df0aee0d28396bdda270f2919aa55c725fa3b17c58fc58cf2be940ca2"} Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.087840 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2d3bd5df0aee0d28396bdda270f2919aa55c725fa3b17c58fc58cf2be940ca2" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.089650 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mqtlv-config-79xqb" event={"ID":"34059d97-cef2-4f86-bfa4-cc719f9bb712","Type":"ContainerDied","Data":"47b6febc4f5364a1a8f0cd84a25b7ef77a068302c4cb9bc6ee693077111bc6e2"} Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.089677 4825 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="47b6febc4f5364a1a8f0cd84a25b7ef77a068302c4cb9bc6ee693077111bc6e2" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.089690 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mqtlv-config-79xqb" Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.741401 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mqtlv-config-79xqb"] Oct 07 19:18:04 crc kubenswrapper[4825]: I1007 19:18:04.754997 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mqtlv-config-79xqb"] Oct 07 19:18:05 crc kubenswrapper[4825]: I1007 19:18:05.099422 4825 generic.go:334] "Generic (PLEG): container finished" podID="c38775cb-8a0d-4834-8215-af35a9fd4952" containerID="b4a7f889fdf763b8db72d95639951da308ba33e2bdd9e991b334942643daadb4" exitCode=0 Oct 07 19:18:05 crc kubenswrapper[4825]: I1007 19:18:05.099475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rnvqv" event={"ID":"c38775cb-8a0d-4834-8215-af35a9fd4952","Type":"ContainerDied","Data":"b4a7f889fdf763b8db72d95639951da308ba33e2bdd9e991b334942643daadb4"} Oct 07 19:18:05 crc kubenswrapper[4825]: I1007 19:18:05.813866 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34059d97-cef2-4f86-bfa4-cc719f9bb712" path="/var/lib/kubelet/pods/34059d97-cef2-4f86-bfa4-cc719f9bb712/volumes" Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.493717 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.560157 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-combined-ca-bundle\") pod \"c38775cb-8a0d-4834-8215-af35a9fd4952\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.560273 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk6qd\" (UniqueName: \"kubernetes.io/projected/c38775cb-8a0d-4834-8215-af35a9fd4952-kube-api-access-bk6qd\") pod \"c38775cb-8a0d-4834-8215-af35a9fd4952\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.560318 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-config-data\") pod \"c38775cb-8a0d-4834-8215-af35a9fd4952\" (UID: \"c38775cb-8a0d-4834-8215-af35a9fd4952\") " Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.567921 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c38775cb-8a0d-4834-8215-af35a9fd4952-kube-api-access-bk6qd" (OuterVolumeSpecName: "kube-api-access-bk6qd") pod "c38775cb-8a0d-4834-8215-af35a9fd4952" (UID: "c38775cb-8a0d-4834-8215-af35a9fd4952"). InnerVolumeSpecName "kube-api-access-bk6qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.591535 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c38775cb-8a0d-4834-8215-af35a9fd4952" (UID: "c38775cb-8a0d-4834-8215-af35a9fd4952"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.609974 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-config-data" (OuterVolumeSpecName: "config-data") pod "c38775cb-8a0d-4834-8215-af35a9fd4952" (UID: "c38775cb-8a0d-4834-8215-af35a9fd4952"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.662764 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.663103 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk6qd\" (UniqueName: \"kubernetes.io/projected/c38775cb-8a0d-4834-8215-af35a9fd4952-kube-api-access-bk6qd\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:06 crc kubenswrapper[4825]: I1007 19:18:06.663196 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38775cb-8a0d-4834-8215-af35a9fd4952-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.134043 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rnvqv" event={"ID":"c38775cb-8a0d-4834-8215-af35a9fd4952","Type":"ContainerDied","Data":"10e01eccdfe5a3457e823265571c43b0c55f6078e516cc140da060b55c868313"} Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.134446 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e01eccdfe5a3457e823265571c43b0c55f6078e516cc140da060b55c868313" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.134477 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rnvqv" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.376663 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-7c55s"] Oct 07 19:18:07 crc kubenswrapper[4825]: E1007 19:18:07.377014 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f685da-b7bf-4718-b45c-7c19a681de56" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377030 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f685da-b7bf-4718-b45c-7c19a681de56" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: E1007 19:18:07.377039 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9661376c-9487-4ac1-af61-e4b4f846f554" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377046 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9661376c-9487-4ac1-af61-e4b4f846f554" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: E1007 19:18:07.377062 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34059d97-cef2-4f86-bfa4-cc719f9bb712" containerName="ovn-config" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377068 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="34059d97-cef2-4f86-bfa4-cc719f9bb712" containerName="ovn-config" Oct 07 19:18:07 crc kubenswrapper[4825]: E1007 19:18:07.377082 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed9dc70-b8c0-434e-9ee3-68c176c29362" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377088 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed9dc70-b8c0-434e-9ee3-68c176c29362" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: E1007 19:18:07.377098 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c38775cb-8a0d-4834-8215-af35a9fd4952" containerName="keystone-db-sync" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377103 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38775cb-8a0d-4834-8215-af35a9fd4952" containerName="keystone-db-sync" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377263 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9661376c-9487-4ac1-af61-e4b4f846f554" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377273 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed9dc70-b8c0-434e-9ee3-68c176c29362" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377285 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="34059d97-cef2-4f86-bfa4-cc719f9bb712" containerName="ovn-config" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377291 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f685da-b7bf-4718-b45c-7c19a681de56" containerName="mariadb-account-create" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.377310 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c38775cb-8a0d-4834-8215-af35a9fd4952" containerName="keystone-db-sync" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.378081 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.441428 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-v2xq9"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.442782 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.444600 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.444982 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.447162 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-7c55s"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.447563 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.447824 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w9rfd" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.453109 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v2xq9"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479493 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-config\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479577 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-fernet-keys\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479622 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xwn5s\" (UniqueName: \"kubernetes.io/projected/a5b707ea-501b-470e-8016-22c333a3f90a-kube-api-access-xwn5s\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479649 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-combined-ca-bundle\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479699 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsqpm\" (UniqueName: \"kubernetes.io/projected/a83e5e77-aee7-49b9-afb7-ce042e674026-kube-api-access-wsqpm\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479730 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-credential-keys\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479752 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479776 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479799 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479834 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-scripts\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479871 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-svc\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.479926 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-config-data\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.569869 4825 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/horizon-54dd4bfd97-nctq4"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.571474 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.574734 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.575755 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.575876 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.575970 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mrd5w" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.580934 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwn5s\" (UniqueName: \"kubernetes.io/projected/a5b707ea-501b-470e-8016-22c333a3f90a-kube-api-access-xwn5s\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.580984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-combined-ca-bundle\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581048 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsqpm\" (UniqueName: \"kubernetes.io/projected/a83e5e77-aee7-49b9-afb7-ce042e674026-kube-api-access-wsqpm\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: 
\"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581072 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-credential-keys\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581097 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581119 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581140 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581174 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-scripts\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " 
pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581212 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-svc\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581296 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-config-data\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581349 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-config\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.581398 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-fernet-keys\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.582581 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.587958 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-svc\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.591833 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.598471 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-config\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.601608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.605312 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-scripts\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.605903 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-combined-ca-bundle\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.607125 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-credential-keys\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.607222 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-config-data\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.607343 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-fernet-keys\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.613681 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsqpm\" (UniqueName: \"kubernetes.io/projected/a83e5e77-aee7-49b9-afb7-ce042e674026-kube-api-access-wsqpm\") pod \"dnsmasq-dns-55fff446b9-7c55s\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.615244 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54dd4bfd97-nctq4"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.620824 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xwn5s\" (UniqueName: \"kubernetes.io/projected/a5b707ea-501b-470e-8016-22c333a3f90a-kube-api-access-xwn5s\") pod \"keystone-bootstrap-v2xq9\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.683758 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-horizon-secret-key\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.683893 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-logs\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.683941 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-scripts\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.684000 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-config-data\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.684158 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tqkr6\" (UniqueName: \"kubernetes.io/projected/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-kube-api-access-tqkr6\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.693006 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.701284 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.703183 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.707383 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.712585 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.726314 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.761159 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785584 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-logs\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785646 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-log-httpd\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785681 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-scripts\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785727 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-config-data\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785757 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-config-data\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785777 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785814 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785848 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkr6\" (UniqueName: \"kubernetes.io/projected/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-kube-api-access-tqkr6\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785905 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-run-httpd\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785925 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-scripts\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785947 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-horizon-secret-key\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.785986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5srm\" (UniqueName: \"kubernetes.io/projected/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-kube-api-access-x5srm\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.786545 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-logs\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.790098 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-scripts\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.791332 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-config-data\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.793145 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-horizon-secret-key\") pod \"horizon-54dd4bfd97-nctq4\" (UID: 
\"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.837623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkr6\" (UniqueName: \"kubernetes.io/projected/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-kube-api-access-tqkr6\") pod \"horizon-54dd4bfd97-nctq4\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.848566 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tnkcb"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.851688 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.854993 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n59v5" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.862047 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.889428 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-config-data\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.889671 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.889803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-combined-ca-bundle\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.889904 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.890076 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-run-httpd\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.890158 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-scripts\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.890271 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5srm\" (UniqueName: \"kubernetes.io/projected/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-kube-api-access-x5srm\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.890430 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-db-sync-config-data\") pod \"barbican-db-sync-tnkcb\" (UID: 
\"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.890537 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4ds8\" (UniqueName: \"kubernetes.io/projected/3dbca3ac-3960-4572-93c4-04276137f96a-kube-api-access-v4ds8\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.890668 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-log-httpd\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.891295 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-log-httpd\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.896296 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-run-httpd\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.941291 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7577885557-tllnd"] Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.943959 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-scripts\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " 
pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.944078 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.948100 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.944082 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.949749 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-config-data\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.959558 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5srm\" (UniqueName: \"kubernetes.io/projected/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-kube-api-access-x5srm\") pod \"ceilometer-0\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " pod="openstack/ceilometer-0" Oct 07 19:18:07 crc kubenswrapper[4825]: I1007 19:18:07.986873 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.000647 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tnkcb"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.001886 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-combined-ca-bundle\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.001951 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qml9s\" (UniqueName: \"kubernetes.io/projected/39272589-06ba-4924-9f8d-a72bc499ba6f-kube-api-access-qml9s\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.001985 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-config-data\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.002016 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-scripts\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.002074 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/39272589-06ba-4924-9f8d-a72bc499ba6f-logs\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.002099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-db-sync-config-data\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.002129 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4ds8\" (UniqueName: \"kubernetes.io/projected/3dbca3ac-3960-4572-93c4-04276137f96a-kube-api-access-v4ds8\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.002193 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39272589-06ba-4924-9f8d-a72bc499ba6f-horizon-secret-key\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.009110 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-combined-ca-bundle\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.009173 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vlkds"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.010364 4825 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.012686 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-25xcg" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.015484 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.015828 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.016500 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-db-sync-config-data\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.027295 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7577885557-tllnd"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.032970 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vlkds"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.043290 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-7c55s"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.064606 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qwgl2"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.064856 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4ds8\" (UniqueName: \"kubernetes.io/projected/3dbca3ac-3960-4572-93c4-04276137f96a-kube-api-access-v4ds8\") pod \"barbican-db-sync-tnkcb\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:08 crc 
kubenswrapper[4825]: I1007 19:18:08.066161 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.081198 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qwgl2"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.093139 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8hldz"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.101986 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.103434 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-config-data\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.103486 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-db-sync-config-data\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.103527 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.103571 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-combined-ca-bundle\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.104575 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2qjzz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.104737 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.104745 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.104973 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39272589-06ba-4924-9f8d-a72bc499ba6f-horizon-secret-key\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105113 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxrp\" (UniqueName: \"kubernetes.io/projected/89497435-87e0-4c79-9372-3ad2ae5c3161-kube-api-access-pnxrp\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105157 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-scripts\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 
19:18:08.105172 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105261 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba465067-0e79-4d52-bc56-a4b60767eb7d-etc-machine-id\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105306 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qml9s\" (UniqueName: \"kubernetes.io/projected/39272589-06ba-4924-9f8d-a72bc499ba6f-kube-api-access-qml9s\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105362 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-config-data\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105417 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-scripts\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105440 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105475 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105542 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrmz2\" (UniqueName: \"kubernetes.io/projected/ba465067-0e79-4d52-bc56-a4b60767eb7d-kube-api-access-zrmz2\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105627 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39272589-06ba-4924-9f8d-a72bc499ba6f-logs\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.105663 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-config\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.106288 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-scripts\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.106582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39272589-06ba-4924-9f8d-a72bc499ba6f-logs\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.106795 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-config-data\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.108720 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39272589-06ba-4924-9f8d-a72bc499ba6f-horizon-secret-key\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.124764 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8hldz"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.149357 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qml9s\" (UniqueName: \"kubernetes.io/projected/39272589-06ba-4924-9f8d-a72bc499ba6f-kube-api-access-qml9s\") pod \"horizon-7577885557-tllnd\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.155882 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.158718 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cccsz"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.160057 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.163513 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-69wjq" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.163521 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.163821 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.168654 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cccsz"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.205584 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.209893 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-scripts\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.209922 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.209953 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba465067-0e79-4d52-bc56-a4b60767eb7d-etc-machine-id\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.209977 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjfwv\" (UniqueName: \"kubernetes.io/projected/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-kube-api-access-bjfwv\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.209999 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc 
kubenswrapper[4825]: I1007 19:18:08.210030 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210052 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210075 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-scripts\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210096 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrmz2\" (UniqueName: \"kubernetes.io/projected/ba465067-0e79-4d52-bc56-a4b60767eb7d-kube-api-access-zrmz2\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210115 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6s7\" (UniqueName: \"kubernetes.io/projected/1abc8e94-8f1f-4195-b476-248206d004bf-kube-api-access-zv6s7\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210157 
4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-config\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210175 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-combined-ca-bundle\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210192 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1abc8e94-8f1f-4195-b476-248206d004bf-logs\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210208 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-config-data\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.210984 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.212753 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-combined-ca-bundle\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.212800 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-config-data\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.212822 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-db-sync-config-data\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.212855 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.212885 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-combined-ca-bundle\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.212908 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxrp\" 
(UniqueName: \"kubernetes.io/projected/89497435-87e0-4c79-9372-3ad2ae5c3161-kube-api-access-pnxrp\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.214034 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba465067-0e79-4d52-bc56-a4b60767eb7d-etc-machine-id\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.214350 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.214846 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.215120 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.220062 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-scripts\") pod \"cinder-db-sync-vlkds\" 
(UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.220089 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-combined-ca-bundle\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.220563 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-config-data\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.221868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-config\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.222852 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-db-sync-config-data\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.229433 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxrp\" (UniqueName: \"kubernetes.io/projected/89497435-87e0-4c79-9372-3ad2ae5c3161-kube-api-access-pnxrp\") pod \"dnsmasq-dns-76fcf4b695-qwgl2\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 
19:18:08.230603 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrmz2\" (UniqueName: \"kubernetes.io/projected/ba465067-0e79-4d52-bc56-a4b60767eb7d-kube-api-access-zrmz2\") pod \"cinder-db-sync-vlkds\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.306405 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.315155 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-combined-ca-bundle\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.315913 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1abc8e94-8f1f-4195-b476-248206d004bf-logs\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.316019 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-config-data\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.316262 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-combined-ca-bundle\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " 
pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.316894 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjfwv\" (UniqueName: \"kubernetes.io/projected/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-kube-api-access-bjfwv\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.337824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.338100 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-scripts\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.338166 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6s7\" (UniqueName: \"kubernetes.io/projected/1abc8e94-8f1f-4195-b476-248206d004bf-kube-api-access-zv6s7\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.332346 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-combined-ca-bundle\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.334171 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjfwv\" (UniqueName: \"kubernetes.io/projected/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-kube-api-access-bjfwv\") pod \"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.318251 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1abc8e94-8f1f-4195-b476-248206d004bf-logs\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.322375 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-combined-ca-bundle\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.323543 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-config-data\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.345059 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-scripts\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.355186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config\") pod 
\"neutron-db-sync-cccsz\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.357274 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vlkds" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.358962 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6s7\" (UniqueName: \"kubernetes.io/projected/1abc8e94-8f1f-4195-b476-248206d004bf-kube-api-access-zv6s7\") pod \"placement-db-sync-8hldz\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.406135 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.457571 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.476417 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.486927 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v2xq9"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.518358 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-7c55s"] Oct 07 19:18:08 crc kubenswrapper[4825]: W1007 19:18:08.545768 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda83e5e77_aee7_49b9_afb7_ce042e674026.slice/crio-a7278e30c2ab67fa3389d56247e4685ceb19d7cf5ee81a7ea56fcb91a6154f8a WatchSource:0}: Error finding container a7278e30c2ab67fa3389d56247e4685ceb19d7cf5ee81a7ea56fcb91a6154f8a: Status 404 returned error can't find the container with id a7278e30c2ab67fa3389d56247e4685ceb19d7cf5ee81a7ea56fcb91a6154f8a Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.680604 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54dd4bfd97-nctq4"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.741742 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tnkcb"] Oct 07 19:18:08 crc kubenswrapper[4825]: I1007 19:18:08.759633 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:18:08 crc kubenswrapper[4825]: W1007 19:18:08.768648 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dbca3ac_3960_4572_93c4_04276137f96a.slice/crio-1b59aad5bc2b0e4b14c44b2fd8bd5bfac17ddd4ef1a28ac60cef9a12bcc0478c WatchSource:0}: Error finding container 1b59aad5bc2b0e4b14c44b2fd8bd5bfac17ddd4ef1a28ac60cef9a12bcc0478c: Status 404 returned error can't find the container with id 1b59aad5bc2b0e4b14c44b2fd8bd5bfac17ddd4ef1a28ac60cef9a12bcc0478c Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.048389 
4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vlkds"] Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.098059 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8hldz"] Oct 07 19:18:09 crc kubenswrapper[4825]: W1007 19:18:09.113547 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89497435_87e0_4c79_9372_3ad2ae5c3161.slice/crio-0bf9f0799ba26f8ca9332cd60a72bef02be9ad4632a100fa776c4cbc4565e553 WatchSource:0}: Error finding container 0bf9f0799ba26f8ca9332cd60a72bef02be9ad4632a100fa776c4cbc4565e553: Status 404 returned error can't find the container with id 0bf9f0799ba26f8ca9332cd60a72bef02be9ad4632a100fa776c4cbc4565e553 Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.117536 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7577885557-tllnd"] Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.125135 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qwgl2"] Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.166325 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tnkcb" event={"ID":"3dbca3ac-3960-4572-93c4-04276137f96a","Type":"ContainerStarted","Data":"1b59aad5bc2b0e4b14c44b2fd8bd5bfac17ddd4ef1a28ac60cef9a12bcc0478c"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.168443 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54dd4bfd97-nctq4" event={"ID":"11b2ace9-8b4b-4b81-ac2a-602ad2860c67","Type":"ContainerStarted","Data":"5a4df37f8bf2fa45a52c4f5850cf77ea4e352dcc286646588d579ffe92d7ed30"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.170171 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v2xq9" 
event={"ID":"a5b707ea-501b-470e-8016-22c333a3f90a","Type":"ContainerStarted","Data":"40621b2130780439219c836e4dd5f0a90517746ff37a56bcf45a53e4dab5c21f"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.170214 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v2xq9" event={"ID":"a5b707ea-501b-470e-8016-22c333a3f90a","Type":"ContainerStarted","Data":"df3d23a09390a80095c72bf4113deb62383b7d8468108206d74c79617273b982"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.172173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" event={"ID":"89497435-87e0-4c79-9372-3ad2ae5c3161","Type":"ContainerStarted","Data":"0bf9f0799ba26f8ca9332cd60a72bef02be9ad4632a100fa776c4cbc4565e553"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.173076 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerStarted","Data":"ef067aaf69830f4ddb564bc3588e55e04ba8e33c21200559a93c30ce956b0c9a"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.177629 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vlkds" event={"ID":"ba465067-0e79-4d52-bc56-a4b60767eb7d","Type":"ContainerStarted","Data":"6444a930ea0d25a394f49662cf1474644cf9dc30fddaf5cf1d27f748730f25f5"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.179326 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8hldz" event={"ID":"1abc8e94-8f1f-4195-b476-248206d004bf","Type":"ContainerStarted","Data":"ee434fdd336abb19037a78d6d8e7ab2939811a07512343010a3ee83448f2338e"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.181658 4825 generic.go:334] "Generic (PLEG): container finished" podID="a83e5e77-aee7-49b9-afb7-ce042e674026" containerID="1c1356ae030bb271598fbdc532fefb55131c41da64dc7fbff5e66e2ca61dc7ce" exitCode=0 Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 
19:18:09.181727 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-7c55s" event={"ID":"a83e5e77-aee7-49b9-afb7-ce042e674026","Type":"ContainerDied","Data":"1c1356ae030bb271598fbdc532fefb55131c41da64dc7fbff5e66e2ca61dc7ce"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.181750 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-7c55s" event={"ID":"a83e5e77-aee7-49b9-afb7-ce042e674026","Type":"ContainerStarted","Data":"a7278e30c2ab67fa3389d56247e4685ceb19d7cf5ee81a7ea56fcb91a6154f8a"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.189088 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7577885557-tllnd" event={"ID":"39272589-06ba-4924-9f8d-a72bc499ba6f","Type":"ContainerStarted","Data":"e2d3764bb8b937a36a0d4284bc81b08c89d163d2876024b89f74bd7ab575debc"} Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.192545 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-v2xq9" podStartSLOduration=2.192501574 podStartE2EDuration="2.192501574s" podCreationTimestamp="2025-10-07 19:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:09.183169635 +0000 UTC m=+1078.005208262" watchObservedRunningTime="2025-10-07 19:18:09.192501574 +0000 UTC m=+1078.014540211" Oct 07 19:18:09 crc kubenswrapper[4825]: I1007 19:18:09.234534 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cccsz"] Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.468006 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54dd4bfd97-nctq4"] Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.547069 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cb7996c8f-f7mzb"] Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.561641 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.563704 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cb7996c8f-f7mzb"] Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.675200 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-config-data\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.675288 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d75938-430b-4a74-87aa-50e8acdd63e0-logs\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.675392 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b8d75938-430b-4a74-87aa-50e8acdd63e0-horizon-secret-key\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.675469 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hvs\" (UniqueName: \"kubernetes.io/projected/b8d75938-430b-4a74-87aa-50e8acdd63e0-kube-api-access-m9hvs\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.675512 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-scripts\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.753656 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.778798 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-nb\") pod \"a83e5e77-aee7-49b9-afb7-ce042e674026\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.778883 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsqpm\" (UniqueName: \"kubernetes.io/projected/a83e5e77-aee7-49b9-afb7-ce042e674026-kube-api-access-wsqpm\") pod \"a83e5e77-aee7-49b9-afb7-ce042e674026\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.778965 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-config\") pod \"a83e5e77-aee7-49b9-afb7-ce042e674026\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779002 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-sb\") pod \"a83e5e77-aee7-49b9-afb7-ce042e674026\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779067 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-svc\") pod \"a83e5e77-aee7-49b9-afb7-ce042e674026\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779087 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-swift-storage-0\") pod \"a83e5e77-aee7-49b9-afb7-ce042e674026\" (UID: \"a83e5e77-aee7-49b9-afb7-ce042e674026\") " Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779279 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-scripts\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779336 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-config-data\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779384 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d75938-430b-4a74-87aa-50e8acdd63e0-logs\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779418 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b8d75938-430b-4a74-87aa-50e8acdd63e0-horizon-secret-key\") pod 
\"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.779451 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9hvs\" (UniqueName: \"kubernetes.io/projected/b8d75938-430b-4a74-87aa-50e8acdd63e0-kube-api-access-m9hvs\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.783623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-config-data\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.783802 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83e5e77-aee7-49b9-afb7-ce042e674026-kube-api-access-wsqpm" (OuterVolumeSpecName: "kube-api-access-wsqpm") pod "a83e5e77-aee7-49b9-afb7-ce042e674026" (UID: "a83e5e77-aee7-49b9-afb7-ce042e674026"). InnerVolumeSpecName "kube-api-access-wsqpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.784020 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-scripts\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.784240 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d75938-430b-4a74-87aa-50e8acdd63e0-logs\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.799065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b8d75938-430b-4a74-87aa-50e8acdd63e0-horizon-secret-key\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.805099 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9hvs\" (UniqueName: \"kubernetes.io/projected/b8d75938-430b-4a74-87aa-50e8acdd63e0-kube-api-access-m9hvs\") pod \"horizon-7cb7996c8f-f7mzb\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.805759 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-config" (OuterVolumeSpecName: "config") pod "a83e5e77-aee7-49b9-afb7-ce042e674026" (UID: "a83e5e77-aee7-49b9-afb7-ce042e674026"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.808678 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a83e5e77-aee7-49b9-afb7-ce042e674026" (UID: "a83e5e77-aee7-49b9-afb7-ce042e674026"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.828837 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a83e5e77-aee7-49b9-afb7-ce042e674026" (UID: "a83e5e77-aee7-49b9-afb7-ce042e674026"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.844090 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a83e5e77-aee7-49b9-afb7-ce042e674026" (UID: "a83e5e77-aee7-49b9-afb7-ce042e674026"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.847192 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a83e5e77-aee7-49b9-afb7-ce042e674026" (UID: "a83e5e77-aee7-49b9-afb7-ce042e674026"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.882848 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.882877 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.882905 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.882916 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsqpm\" (UniqueName: \"kubernetes.io/projected/a83e5e77-aee7-49b9-afb7-ce042e674026-kube-api-access-wsqpm\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.882926 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:09.882933 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a83e5e77-aee7-49b9-afb7-ce042e674026-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.033687 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.207406 4825 generic.go:334] "Generic (PLEG): container finished" podID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerID="9b63d1280eb9502ddd8627486ff116af835d88b173f9e7a88616c9b2fd951e8b" exitCode=0 Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.207765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" event={"ID":"89497435-87e0-4c79-9372-3ad2ae5c3161","Type":"ContainerDied","Data":"9b63d1280eb9502ddd8627486ff116af835d88b173f9e7a88616c9b2fd951e8b"} Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.209721 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cw4gf" event={"ID":"e981b526-0afb-4a9c-ba89-fe87728f4603","Type":"ContainerDied","Data":"2db6a6ec54cde59acb7506828b34a2ca3ab8872828ab4031ac5dfafa166331c6"} Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.210536 4825 generic.go:334] "Generic (PLEG): container finished" podID="e981b526-0afb-4a9c-ba89-fe87728f4603" containerID="2db6a6ec54cde59acb7506828b34a2ca3ab8872828ab4031ac5dfafa166331c6" exitCode=0 Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.217069 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cccsz" event={"ID":"019abaa0-c821-4f8c-a195-a9ea7bc81f8b","Type":"ContainerStarted","Data":"5468c16d9290fd9e1ab84b25c93a6102553ae19cdd636995b2fe7c94302ed5b1"} Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.217125 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cccsz" event={"ID":"019abaa0-c821-4f8c-a195-a9ea7bc81f8b","Type":"ContainerStarted","Data":"bc1c44a04c1f57f3399d033b2349de96a60236c0a7729c90f6b8290d19baf628"} Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.219282 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-7c55s" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.219315 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-7c55s" event={"ID":"a83e5e77-aee7-49b9-afb7-ce042e674026","Type":"ContainerDied","Data":"a7278e30c2ab67fa3389d56247e4685ceb19d7cf5ee81a7ea56fcb91a6154f8a"} Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.219358 4825 scope.go:117] "RemoveContainer" containerID="1c1356ae030bb271598fbdc532fefb55131c41da64dc7fbff5e66e2ca61dc7ce" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.274024 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cccsz" podStartSLOduration=3.273973963 podStartE2EDuration="3.273973963s" podCreationTimestamp="2025-10-07 19:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:10.23980986 +0000 UTC m=+1079.061848497" watchObservedRunningTime="2025-10-07 19:18:10.273973963 +0000 UTC m=+1079.096012610" Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.345035 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-7c55s"] Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.353375 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-7c55s"] Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.363342 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:18:10 crc kubenswrapper[4825]: I1007 19:18:10.659415 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cb7996c8f-f7mzb"] Oct 07 19:18:10 crc kubenswrapper[4825]: W1007 19:18:10.667937 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d75938_430b_4a74_87aa_50e8acdd63e0.slice/crio-30383b2c64d209329393a118c552db205ef59934d5ef171b6543faa48940c7a0 WatchSource:0}: Error finding container 30383b2c64d209329393a118c552db205ef59934d5ef171b6543faa48940c7a0: Status 404 returned error can't find the container with id 30383b2c64d209329393a118c552db205ef59934d5ef171b6543faa48940c7a0 Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.230875 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cb7996c8f-f7mzb" event={"ID":"b8d75938-430b-4a74-87aa-50e8acdd63e0","Type":"ContainerStarted","Data":"30383b2c64d209329393a118c552db205ef59934d5ef171b6543faa48940c7a0"} Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.245175 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" event={"ID":"89497435-87e0-4c79-9372-3ad2ae5c3161","Type":"ContainerStarted","Data":"9e19c485d0704e21677e9d62b4df3ec1cbdd7d5475d0fe17901a8d7954602f37"} Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.245509 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.269206 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podStartSLOduration=4.269187351 podStartE2EDuration="4.269187351s" podCreationTimestamp="2025-10-07 19:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:11.266012959 +0000 UTC m=+1080.088051596" watchObservedRunningTime="2025-10-07 19:18:11.269187351 +0000 UTC m=+1080.091225988" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.657070 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-cw4gf" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.720430 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh9ds\" (UniqueName: \"kubernetes.io/projected/e981b526-0afb-4a9c-ba89-fe87728f4603-kube-api-access-kh9ds\") pod \"e981b526-0afb-4a9c-ba89-fe87728f4603\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.720504 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-config-data\") pod \"e981b526-0afb-4a9c-ba89-fe87728f4603\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.721466 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-db-sync-config-data\") pod \"e981b526-0afb-4a9c-ba89-fe87728f4603\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.721530 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-combined-ca-bundle\") pod \"e981b526-0afb-4a9c-ba89-fe87728f4603\" (UID: \"e981b526-0afb-4a9c-ba89-fe87728f4603\") " Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.729795 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e981b526-0afb-4a9c-ba89-fe87728f4603" (UID: "e981b526-0afb-4a9c-ba89-fe87728f4603"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.732457 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e981b526-0afb-4a9c-ba89-fe87728f4603-kube-api-access-kh9ds" (OuterVolumeSpecName: "kube-api-access-kh9ds") pod "e981b526-0afb-4a9c-ba89-fe87728f4603" (UID: "e981b526-0afb-4a9c-ba89-fe87728f4603"). InnerVolumeSpecName "kube-api-access-kh9ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.768447 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e981b526-0afb-4a9c-ba89-fe87728f4603" (UID: "e981b526-0afb-4a9c-ba89-fe87728f4603"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.810683 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-config-data" (OuterVolumeSpecName: "config-data") pod "e981b526-0afb-4a9c-ba89-fe87728f4603" (UID: "e981b526-0afb-4a9c-ba89-fe87728f4603"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.811308 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a83e5e77-aee7-49b9-afb7-ce042e674026" path="/var/lib/kubelet/pods/a83e5e77-aee7-49b9-afb7-ce042e674026/volumes" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.823780 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.823815 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh9ds\" (UniqueName: \"kubernetes.io/projected/e981b526-0afb-4a9c-ba89-fe87728f4603-kube-api-access-kh9ds\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.823828 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:11 crc kubenswrapper[4825]: I1007 19:18:11.823838 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e981b526-0afb-4a9c-ba89-fe87728f4603-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.256148 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-cw4gf" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.257173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cw4gf" event={"ID":"e981b526-0afb-4a9c-ba89-fe87728f4603","Type":"ContainerDied","Data":"d13825f1b815c92638c69f6725aa09f0f1b96cee691e74b679418aece4ca46f8"} Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.257208 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13825f1b815c92638c69f6725aa09f0f1b96cee691e74b679418aece4ca46f8" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.630445 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qwgl2"] Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.664860 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-5qdgx"] Oct 07 19:18:12 crc kubenswrapper[4825]: E1007 19:18:12.665212 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e981b526-0afb-4a9c-ba89-fe87728f4603" containerName="glance-db-sync" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.665240 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e981b526-0afb-4a9c-ba89-fe87728f4603" containerName="glance-db-sync" Oct 07 19:18:12 crc kubenswrapper[4825]: E1007 19:18:12.665268 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83e5e77-aee7-49b9-afb7-ce042e674026" containerName="init" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.665275 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83e5e77-aee7-49b9-afb7-ce042e674026" containerName="init" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.665435 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83e5e77-aee7-49b9-afb7-ce042e674026" containerName="init" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.665454 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e981b526-0afb-4a9c-ba89-fe87728f4603" containerName="glance-db-sync" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.666352 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.692445 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-5qdgx"] Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.742740 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.742785 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-config\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.742807 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9l2\" (UniqueName: \"kubernetes.io/projected/5fddc514-0ec2-4022-9971-75c8dd44ef6c-kube-api-access-gn9l2\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.742826 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: 
\"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.742853 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.742888 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.844698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.844759 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-config\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.844790 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9l2\" (UniqueName: \"kubernetes.io/projected/5fddc514-0ec2-4022-9971-75c8dd44ef6c-kube-api-access-gn9l2\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: 
\"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.844812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.844839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.844877 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.845639 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.845864 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 
19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.846330 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.846411 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-config\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.846808 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:12 crc kubenswrapper[4825]: I1007 19:18:12.863266 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9l2\" (UniqueName: \"kubernetes.io/projected/5fddc514-0ec2-4022-9971-75c8dd44ef6c-kube-api-access-gn9l2\") pod \"dnsmasq-dns-8b5c85b87-5qdgx\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.003838 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.264603 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" containerID="cri-o://9e19c485d0704e21677e9d62b4df3ec1cbdd7d5475d0fe17901a8d7954602f37" gracePeriod=10 Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.716279 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.718768 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.720362 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gq8x4" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.720609 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.720751 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.725438 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.760786 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-logs\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.760832 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.760856 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.760884 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.760928 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.760966 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z565j\" (UniqueName: \"kubernetes.io/projected/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-kube-api-access-z565j\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.760988 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.770284 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.771820 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.773778 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.781276 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.862984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.863080 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z565j\" (UniqueName: \"kubernetes.io/projected/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-kube-api-access-z565j\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.863157 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.863300 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-logs\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.863326 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.863403 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.863454 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.864971 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-logs\") pod \"glance-default-external-api-0\" (UID: 
\"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.864974 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.866865 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.873601 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.874418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.878670 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" 
Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.880358 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z565j\" (UniqueName: \"kubernetes.io/projected/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-kube-api-access-z565j\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.900055 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.964610 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.964713 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.964798 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:13 crc 
kubenswrapper[4825]: I1007 19:18:13.964835 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-logs\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.964863 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.964894 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:13 crc kubenswrapper[4825]: I1007 19:18:13.964923 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmccg\" (UniqueName: \"kubernetes.io/projected/4c0c4637-00a5-4fd1-8105-4b36770b00b6-kube-api-access-jmccg\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.055486 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.066892 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.066954 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-logs\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.066984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.067014 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.067044 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmccg\" (UniqueName: \"kubernetes.io/projected/4c0c4637-00a5-4fd1-8105-4b36770b00b6-kube-api-access-jmccg\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc 
kubenswrapper[4825]: I1007 19:18:14.067099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.067167 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.068441 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.068526 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.068650 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-logs\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.082547 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.085351 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.090340 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.092384 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmccg\" (UniqueName: \"kubernetes.io/projected/4c0c4637-00a5-4fd1-8105-4b36770b00b6-kube-api-access-jmccg\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.106201 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.280082 4825 generic.go:334] "Generic (PLEG): container finished" podID="89497435-87e0-4c79-9372-3ad2ae5c3161" 
containerID="9e19c485d0704e21677e9d62b4df3ec1cbdd7d5475d0fe17901a8d7954602f37" exitCode=0 Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.280131 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" event={"ID":"89497435-87e0-4c79-9372-3ad2ae5c3161","Type":"ContainerDied","Data":"9e19c485d0704e21677e9d62b4df3ec1cbdd7d5475d0fe17901a8d7954602f37"} Oct 07 19:18:14 crc kubenswrapper[4825]: I1007 19:18:14.406986 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:15 crc kubenswrapper[4825]: I1007 19:18:15.292380 4825 generic.go:334] "Generic (PLEG): container finished" podID="a5b707ea-501b-470e-8016-22c333a3f90a" containerID="40621b2130780439219c836e4dd5f0a90517746ff37a56bcf45a53e4dab5c21f" exitCode=0 Oct 07 19:18:15 crc kubenswrapper[4825]: I1007 19:18:15.292455 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v2xq9" event={"ID":"a5b707ea-501b-470e-8016-22c333a3f90a","Type":"ContainerDied","Data":"40621b2130780439219c836e4dd5f0a90517746ff37a56bcf45a53e4dab5c21f"} Oct 07 19:18:18 crc kubenswrapper[4825]: I1007 19:18:18.411790 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Oct 07 19:18:19 crc kubenswrapper[4825]: I1007 19:18:19.674155 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:19 crc kubenswrapper[4825]: I1007 19:18:19.737798 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.021401 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7577885557-tllnd"] Oct 07 19:18:20 crc 
kubenswrapper[4825]: I1007 19:18:20.068691 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-566c6c8d88-h74t9"] Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.084108 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.087484 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-566c6c8d88-h74t9"] Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.087831 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.110857 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cb7996c8f-f7mzb"] Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.157533 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58d7dd5b56-nhlgz"] Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.159428 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.174133 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58d7dd5b56-nhlgz"] Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.187912 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-tls-certs\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.187952 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-config-data\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.187982 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-scripts\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.188058 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-secret-key\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.188107 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b66fe3a9-9849-4219-badb-a0cecbb2a388-logs\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.188132 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-combined-ca-bundle\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.188181 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2s7\" (UniqueName: \"kubernetes.io/projected/b66fe3a9-9849-4219-badb-a0cecbb2a388-kube-api-access-vk2s7\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289383 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-tls-certs\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289424 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-config-data\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289459 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-scripts\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289490 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-horizon-secret-key\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289510 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-horizon-tls-certs\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289550 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-secret-key\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289565 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/710a139f-bf12-4021-b702-3e40d49febf1-config-data\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289585 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-combined-ca-bundle\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289605 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/710a139f-bf12-4021-b702-3e40d49febf1-scripts\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289630 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710a139f-bf12-4021-b702-3e40d49febf1-logs\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289655 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqvx\" (UniqueName: \"kubernetes.io/projected/710a139f-bf12-4021-b702-3e40d49febf1-kube-api-access-qgqvx\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289687 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66fe3a9-9849-4219-badb-a0cecbb2a388-logs\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289711 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-combined-ca-bundle\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.289735 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2s7\" (UniqueName: \"kubernetes.io/projected/b66fe3a9-9849-4219-badb-a0cecbb2a388-kube-api-access-vk2s7\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.290510 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-scripts\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.290732 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66fe3a9-9849-4219-badb-a0cecbb2a388-logs\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.291744 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-config-data\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.296132 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-combined-ca-bundle\") pod \"horizon-566c6c8d88-h74t9\" (UID: 
\"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.296710 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-secret-key\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.302590 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-tls-certs\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.308308 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2s7\" (UniqueName: \"kubernetes.io/projected/b66fe3a9-9849-4219-badb-a0cecbb2a388-kube-api-access-vk2s7\") pod \"horizon-566c6c8d88-h74t9\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.391795 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/710a139f-bf12-4021-b702-3e40d49febf1-config-data\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.392153 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/710a139f-bf12-4021-b702-3e40d49febf1-scripts\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc 
kubenswrapper[4825]: I1007 19:18:20.392175 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-combined-ca-bundle\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.392210 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710a139f-bf12-4021-b702-3e40d49febf1-logs\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.392265 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqvx\" (UniqueName: \"kubernetes.io/projected/710a139f-bf12-4021-b702-3e40d49febf1-kube-api-access-qgqvx\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.392805 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710a139f-bf12-4021-b702-3e40d49febf1-logs\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.393003 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/710a139f-bf12-4021-b702-3e40d49febf1-scripts\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.393035 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-horizon-secret-key\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.393059 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-horizon-tls-certs\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.393298 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/710a139f-bf12-4021-b702-3e40d49febf1-config-data\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.396937 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-horizon-secret-key\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.397172 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-horizon-tls-certs\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.397970 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/710a139f-bf12-4021-b702-3e40d49febf1-combined-ca-bundle\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.408395 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.411961 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqvx\" (UniqueName: \"kubernetes.io/projected/710a139f-bf12-4021-b702-3e40d49febf1-kube-api-access-qgqvx\") pod \"horizon-58d7dd5b56-nhlgz\" (UID: \"710a139f-bf12-4021-b702-3e40d49febf1\") " pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:20 crc kubenswrapper[4825]: I1007 19:18:20.478994 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:23 crc kubenswrapper[4825]: I1007 19:18:23.407426 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Oct 07 19:18:24 crc kubenswrapper[4825]: E1007 19:18:24.491747 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 07 19:18:24 crc kubenswrapper[4825]: E1007 19:18:24.492125 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4ds8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tnkcb_openstack(3dbca3ac-3960-4572-93c4-04276137f96a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:18:24 crc kubenswrapper[4825]: E1007 19:18:24.493379 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tnkcb" 
podUID="3dbca3ac-3960-4572-93c4-04276137f96a" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.562417 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.701686 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-credential-keys\") pod \"a5b707ea-501b-470e-8016-22c333a3f90a\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.702046 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-scripts\") pod \"a5b707ea-501b-470e-8016-22c333a3f90a\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.702113 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-fernet-keys\") pod \"a5b707ea-501b-470e-8016-22c333a3f90a\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.702155 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-combined-ca-bundle\") pod \"a5b707ea-501b-470e-8016-22c333a3f90a\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.702324 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-config-data\") pod \"a5b707ea-501b-470e-8016-22c333a3f90a\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " Oct 07 19:18:24 crc 
kubenswrapper[4825]: I1007 19:18:24.702415 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwn5s\" (UniqueName: \"kubernetes.io/projected/a5b707ea-501b-470e-8016-22c333a3f90a-kube-api-access-xwn5s\") pod \"a5b707ea-501b-470e-8016-22c333a3f90a\" (UID: \"a5b707ea-501b-470e-8016-22c333a3f90a\") " Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.707585 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-scripts" (OuterVolumeSpecName: "scripts") pod "a5b707ea-501b-470e-8016-22c333a3f90a" (UID: "a5b707ea-501b-470e-8016-22c333a3f90a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.708639 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a5b707ea-501b-470e-8016-22c333a3f90a" (UID: "a5b707ea-501b-470e-8016-22c333a3f90a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.711476 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b707ea-501b-470e-8016-22c333a3f90a-kube-api-access-xwn5s" (OuterVolumeSpecName: "kube-api-access-xwn5s") pod "a5b707ea-501b-470e-8016-22c333a3f90a" (UID: "a5b707ea-501b-470e-8016-22c333a3f90a"). InnerVolumeSpecName "kube-api-access-xwn5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.721530 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a5b707ea-501b-470e-8016-22c333a3f90a" (UID: "a5b707ea-501b-470e-8016-22c333a3f90a"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.732998 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-config-data" (OuterVolumeSpecName: "config-data") pod "a5b707ea-501b-470e-8016-22c333a3f90a" (UID: "a5b707ea-501b-470e-8016-22c333a3f90a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.743505 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5b707ea-501b-470e-8016-22c333a3f90a" (UID: "a5b707ea-501b-470e-8016-22c333a3f90a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.804526 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwn5s\" (UniqueName: \"kubernetes.io/projected/a5b707ea-501b-470e-8016-22c333a3f90a-kube-api-access-xwn5s\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.804587 4825 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.804603 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.804616 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-fernet-keys\") on 
node \"crc\" DevicePath \"\"" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.804630 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:24 crc kubenswrapper[4825]: I1007 19:18:24.804647 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b707ea-501b-470e-8016-22c333a3f90a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.394702 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v2xq9" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.397431 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v2xq9" event={"ID":"a5b707ea-501b-470e-8016-22c333a3f90a","Type":"ContainerDied","Data":"df3d23a09390a80095c72bf4113deb62383b7d8468108206d74c79617273b982"} Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.397477 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df3d23a09390a80095c72bf4113deb62383b7d8468108206d74c79617273b982" Oct 07 19:18:25 crc kubenswrapper[4825]: E1007 19:18:25.398753 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-tnkcb" podUID="3dbca3ac-3960-4572-93c4-04276137f96a" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.639741 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-v2xq9"] Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.646693 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-v2xq9"] Oct 07 
19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.744873 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q29ng"] Oct 07 19:18:25 crc kubenswrapper[4825]: E1007 19:18:25.788926 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b707ea-501b-470e-8016-22c333a3f90a" containerName="keystone-bootstrap" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.788956 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b707ea-501b-470e-8016-22c333a3f90a" containerName="keystone-bootstrap" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.789328 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b707ea-501b-470e-8016-22c333a3f90a" containerName="keystone-bootstrap" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.790025 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q29ng"] Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.790117 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.795012 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.795210 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.795757 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w9rfd" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.795859 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.816047 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b707ea-501b-470e-8016-22c333a3f90a" path="/var/lib/kubelet/pods/a5b707ea-501b-470e-8016-22c333a3f90a/volumes" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.928953 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-combined-ca-bundle\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.929493 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mftgv\" (UniqueName: \"kubernetes.io/projected/2d9e561b-95fb-4643-8452-01f9ae3475eb-kube-api-access-mftgv\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.929618 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-fernet-keys\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.929720 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-config-data\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.930019 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-scripts\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:25 crc kubenswrapper[4825]: I1007 19:18:25.930047 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-credential-keys\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.032387 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mftgv\" (UniqueName: \"kubernetes.io/projected/2d9e561b-95fb-4643-8452-01f9ae3475eb-kube-api-access-mftgv\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.032490 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-fernet-keys\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.032533 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-config-data\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.032575 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-scripts\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.032596 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-credential-keys\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.032673 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-combined-ca-bundle\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.037513 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-scripts\") pod \"keystone-bootstrap-q29ng\" (UID: 
\"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.038755 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-combined-ca-bundle\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.039355 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-credential-keys\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.040678 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-fernet-keys\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.041383 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-config-data\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 19:18:26.056895 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mftgv\" (UniqueName: \"kubernetes.io/projected/2d9e561b-95fb-4643-8452-01f9ae3475eb-kube-api-access-mftgv\") pod \"keystone-bootstrap-q29ng\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:26 crc kubenswrapper[4825]: I1007 
19:18:26.119333 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:29 crc kubenswrapper[4825]: E1007 19:18:29.989338 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 07 19:18:29 crc kubenswrapper[4825]: E1007 19:18:29.990005 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf8h656h559h5c7h665h8bh565h568h697hfdh67bhdbh695h4h8fh4h56ch66h698h86hfh677h5cbh8ch7ch64h58bh6bh58bhd7h676h695q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqkr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privil
eged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-54dd4bfd97-nctq4_openstack(11b2ace9-8b4b-4b81-ac2a-602ad2860c67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:18:29 crc kubenswrapper[4825]: E1007 19:18:29.996095 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-54dd4bfd97-nctq4" podUID="11b2ace9-8b4b-4b81-ac2a-602ad2860c67" Oct 07 19:18:31 crc kubenswrapper[4825]: E1007 19:18:31.684412 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 07 19:18:31 crc kubenswrapper[4825]: E1007 19:18:31.685690 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zv6s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-8hldz_openstack(1abc8e94-8f1f-4195-b476-248206d004bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:18:31 crc kubenswrapper[4825]: E1007 19:18:31.687157 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-8hldz" podUID="1abc8e94-8f1f-4195-b476-248206d004bf" Oct 07 19:18:31 crc kubenswrapper[4825]: E1007 19:18:31.704208 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 07 19:18:31 crc kubenswrapper[4825]: E1007 19:18:31.704507 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5h86h5d5h5bh558h54bh5f9h565h5dfhd6h587h68chffh5d8h645h5b7h6ch7fh9fh55h696h5d8h5c4h6fh97h699h5c9h567h654h56chbbh679q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9hvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7cb7996c8f-f7mzb_openstack(b8d75938-430b-4a74-87aa-50e8acdd63e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:18:31 crc kubenswrapper[4825]: E1007 
19:18:31.707606 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7cb7996c8f-f7mzb" podUID="b8d75938-430b-4a74-87aa-50e8acdd63e0" Oct 07 19:18:32 crc kubenswrapper[4825]: E1007 19:18:32.479043 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-8hldz" podUID="1abc8e94-8f1f-4195-b476-248206d004bf" Oct 07 19:18:33 crc kubenswrapper[4825]: I1007 19:18:33.408498 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Oct 07 19:18:38 crc kubenswrapper[4825]: I1007 19:18:38.409062 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Oct 07 19:18:42 crc kubenswrapper[4825]: E1007 19:18:42.977578 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 07 19:18:42 crc kubenswrapper[4825]: E1007 19:18:42.978157 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zrmz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsU
ser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vlkds_openstack(ba465067-0e79-4d52-bc56-a4b60767eb7d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:18:42 crc kubenswrapper[4825]: E1007 19:18:42.979713 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vlkds" podUID="ba465067-0e79-4d52-bc56-a4b60767eb7d" Oct 07 19:18:43 crc kubenswrapper[4825]: I1007 19:18:43.409904 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Oct 07 19:18:43 crc kubenswrapper[4825]: E1007 19:18:43.602707 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vlkds" podUID="ba465067-0e79-4d52-bc56-a4b60767eb7d" Oct 07 19:18:44 crc kubenswrapper[4825]: E1007 19:18:44.592134 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 07 19:18:44 crc kubenswrapper[4825]: E1007 19:18:44.592471 4825 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67dh8fh79h98h649h5c9h66dh599h5bch54chfch5dfh544h5f7h577h59fh58dh65h576h667h55h5fbh569h5fbh577h5f8h57bh578h546h674h89hf9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qml9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-7577885557-tllnd_openstack(39272589-06ba-4924-9f8d-a72bc499ba6f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:18:44 crc kubenswrapper[4825]: E1007 19:18:44.603395 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7577885557-tllnd" podUID="39272589-06ba-4924-9f8d-a72bc499ba6f" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.741311 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.747680 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.758560 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923437 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnxrp\" (UniqueName: \"kubernetes.io/projected/89497435-87e0-4c79-9372-3ad2ae5c3161-kube-api-access-pnxrp\") pod \"89497435-87e0-4c79-9372-3ad2ae5c3161\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923491 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b8d75938-430b-4a74-87aa-50e8acdd63e0-horizon-secret-key\") pod \"b8d75938-430b-4a74-87aa-50e8acdd63e0\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923557 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-swift-storage-0\") pod \"89497435-87e0-4c79-9372-3ad2ae5c3161\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923622 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-scripts\") pod \"b8d75938-430b-4a74-87aa-50e8acdd63e0\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923653 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-scripts\") pod \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923713 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-nb\") pod \"89497435-87e0-4c79-9372-3ad2ae5c3161\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923780 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-config-data\") pod \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923822 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqkr6\" (UniqueName: \"kubernetes.io/projected/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-kube-api-access-tqkr6\") pod \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9hvs\" (UniqueName: \"kubernetes.io/projected/b8d75938-430b-4a74-87aa-50e8acdd63e0-kube-api-access-m9hvs\") pod \"b8d75938-430b-4a74-87aa-50e8acdd63e0\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923924 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-svc\") pod \"89497435-87e0-4c79-9372-3ad2ae5c3161\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.923982 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-horizon-secret-key\") pod \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " Oct 07 
19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.924024 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-sb\") pod \"89497435-87e0-4c79-9372-3ad2ae5c3161\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.924064 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-config\") pod \"89497435-87e0-4c79-9372-3ad2ae5c3161\" (UID: \"89497435-87e0-4c79-9372-3ad2ae5c3161\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.924098 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d75938-430b-4a74-87aa-50e8acdd63e0-logs\") pod \"b8d75938-430b-4a74-87aa-50e8acdd63e0\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.924381 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-config-data\") pod \"b8d75938-430b-4a74-87aa-50e8acdd63e0\" (UID: \"b8d75938-430b-4a74-87aa-50e8acdd63e0\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.924421 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-logs\") pod \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\" (UID: \"11b2ace9-8b4b-4b81-ac2a-602ad2860c67\") " Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.925177 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-scripts" (OuterVolumeSpecName: "scripts") pod "b8d75938-430b-4a74-87aa-50e8acdd63e0" (UID: 
"b8d75938-430b-4a74-87aa-50e8acdd63e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.925218 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-scripts" (OuterVolumeSpecName: "scripts") pod "11b2ace9-8b4b-4b81-ac2a-602ad2860c67" (UID: "11b2ace9-8b4b-4b81-ac2a-602ad2860c67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.925792 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-logs" (OuterVolumeSpecName: "logs") pod "11b2ace9-8b4b-4b81-ac2a-602ad2860c67" (UID: "11b2ace9-8b4b-4b81-ac2a-602ad2860c67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.926172 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d75938-430b-4a74-87aa-50e8acdd63e0-logs" (OuterVolumeSpecName: "logs") pod "b8d75938-430b-4a74-87aa-50e8acdd63e0" (UID: "b8d75938-430b-4a74-87aa-50e8acdd63e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.926594 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-config-data" (OuterVolumeSpecName: "config-data") pod "11b2ace9-8b4b-4b81-ac2a-602ad2860c67" (UID: "11b2ace9-8b4b-4b81-ac2a-602ad2860c67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.927035 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-config-data" (OuterVolumeSpecName: "config-data") pod "b8d75938-430b-4a74-87aa-50e8acdd63e0" (UID: "b8d75938-430b-4a74-87aa-50e8acdd63e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.930322 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "11b2ace9-8b4b-4b81-ac2a-602ad2860c67" (UID: "11b2ace9-8b4b-4b81-ac2a-602ad2860c67"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.931262 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d75938-430b-4a74-87aa-50e8acdd63e0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b8d75938-430b-4a74-87aa-50e8acdd63e0" (UID: "b8d75938-430b-4a74-87aa-50e8acdd63e0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.932318 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-kube-api-access-tqkr6" (OuterVolumeSpecName: "kube-api-access-tqkr6") pod "11b2ace9-8b4b-4b81-ac2a-602ad2860c67" (UID: "11b2ace9-8b4b-4b81-ac2a-602ad2860c67"). InnerVolumeSpecName "kube-api-access-tqkr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.932531 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d75938-430b-4a74-87aa-50e8acdd63e0-kube-api-access-m9hvs" (OuterVolumeSpecName: "kube-api-access-m9hvs") pod "b8d75938-430b-4a74-87aa-50e8acdd63e0" (UID: "b8d75938-430b-4a74-87aa-50e8acdd63e0"). InnerVolumeSpecName "kube-api-access-m9hvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.932660 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89497435-87e0-4c79-9372-3ad2ae5c3161-kube-api-access-pnxrp" (OuterVolumeSpecName: "kube-api-access-pnxrp") pod "89497435-87e0-4c79-9372-3ad2ae5c3161" (UID: "89497435-87e0-4c79-9372-3ad2ae5c3161"). InnerVolumeSpecName "kube-api-access-pnxrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.972940 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89497435-87e0-4c79-9372-3ad2ae5c3161" (UID: "89497435-87e0-4c79-9372-3ad2ae5c3161"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.983713 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-config" (OuterVolumeSpecName: "config") pod "89497435-87e0-4c79-9372-3ad2ae5c3161" (UID: "89497435-87e0-4c79-9372-3ad2ae5c3161"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.983779 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89497435-87e0-4c79-9372-3ad2ae5c3161" (UID: "89497435-87e0-4c79-9372-3ad2ae5c3161"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.993061 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89497435-87e0-4c79-9372-3ad2ae5c3161" (UID: "89497435-87e0-4c79-9372-3ad2ae5c3161"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:44 crc kubenswrapper[4825]: I1007 19:18:44.994812 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89497435-87e0-4c79-9372-3ad2ae5c3161" (UID: "89497435-87e0-4c79-9372-3ad2ae5c3161"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.026934 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.026976 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.026995 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027012 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d75938-430b-4a74-87aa-50e8acdd63e0-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027029 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027044 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027058 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnxrp\" (UniqueName: \"kubernetes.io/projected/89497435-87e0-4c79-9372-3ad2ae5c3161-kube-api-access-pnxrp\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027077 4825 reconciler_common.go:293] "Volume detached 
for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b8d75938-430b-4a74-87aa-50e8acdd63e0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027092 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027106 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d75938-430b-4a74-87aa-50e8acdd63e0-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027158 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027173 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027188 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027203 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqkr6\" (UniqueName: \"kubernetes.io/projected/11b2ace9-8b4b-4b81-ac2a-602ad2860c67-kube-api-access-tqkr6\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027217 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9hvs\" (UniqueName: 
\"kubernetes.io/projected/b8d75938-430b-4a74-87aa-50e8acdd63e0-kube-api-access-m9hvs\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.027249 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89497435-87e0-4c79-9372-3ad2ae5c3161-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:45 crc kubenswrapper[4825]: E1007 19:18:45.451802 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 07 19:18:45 crc kubenswrapper[4825]: E1007 19:18:45.452046 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb6h547hf5h86h654hddh644h5h57h85h65fh9fh588h5dch559hc5h5fbh5bh5d5h588h579h575h66h666h5f4h67chbdh9ch55h74h75h659q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,Sub
Path:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5srm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5ec7dda1-a8ec-4aa6-a3be-25c200b51d15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.619495 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54dd4bfd97-nctq4" event={"ID":"11b2ace9-8b4b-4b81-ac2a-602ad2860c67","Type":"ContainerDied","Data":"5a4df37f8bf2fa45a52c4f5850cf77ea4e352dcc286646588d579ffe92d7ed30"} Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.619535 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54dd4bfd97-nctq4" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.621641 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" event={"ID":"89497435-87e0-4c79-9372-3ad2ae5c3161","Type":"ContainerDied","Data":"0bf9f0799ba26f8ca9332cd60a72bef02be9ad4632a100fa776c4cbc4565e553"} Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.621713 4825 scope.go:117] "RemoveContainer" containerID="9e19c485d0704e21677e9d62b4df3ec1cbdd7d5475d0fe17901a8d7954602f37" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.621847 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.625992 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cb7996c8f-f7mzb" event={"ID":"b8d75938-430b-4a74-87aa-50e8acdd63e0","Type":"ContainerDied","Data":"30383b2c64d209329393a118c552db205ef59934d5ef171b6543faa48940c7a0"} Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.626021 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cb7996c8f-f7mzb" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.810367 4825 scope.go:117] "RemoveContainer" containerID="9b63d1280eb9502ddd8627486ff116af835d88b173f9e7a88616c9b2fd951e8b" Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.875223 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54dd4bfd97-nctq4"] Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.904213 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54dd4bfd97-nctq4"] Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.927898 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cb7996c8f-f7mzb"] Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.933510 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cb7996c8f-f7mzb"] Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.945291 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qwgl2"] Oct 07 19:18:45 crc kubenswrapper[4825]: I1007 19:18:45.958120 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qwgl2"] Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.167537 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.187719 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.238514 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-566c6c8d88-h74t9"] Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.249968 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58d7dd5b56-nhlgz"] Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.262360 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-config-data\") pod \"39272589-06ba-4924-9f8d-a72bc499ba6f\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.262637 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qml9s\" (UniqueName: \"kubernetes.io/projected/39272589-06ba-4924-9f8d-a72bc499ba6f-kube-api-access-qml9s\") pod \"39272589-06ba-4924-9f8d-a72bc499ba6f\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.262716 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39272589-06ba-4924-9f8d-a72bc499ba6f-logs\") pod \"39272589-06ba-4924-9f8d-a72bc499ba6f\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.262758 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-scripts\") pod \"39272589-06ba-4924-9f8d-a72bc499ba6f\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 
19:18:46.262867 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39272589-06ba-4924-9f8d-a72bc499ba6f-horizon-secret-key\") pod \"39272589-06ba-4924-9f8d-a72bc499ba6f\" (UID: \"39272589-06ba-4924-9f8d-a72bc499ba6f\") " Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.263314 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-config-data" (OuterVolumeSpecName: "config-data") pod "39272589-06ba-4924-9f8d-a72bc499ba6f" (UID: "39272589-06ba-4924-9f8d-a72bc499ba6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.263748 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39272589-06ba-4924-9f8d-a72bc499ba6f-logs" (OuterVolumeSpecName: "logs") pod "39272589-06ba-4924-9f8d-a72bc499ba6f" (UID: "39272589-06ba-4924-9f8d-a72bc499ba6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.264328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-scripts" (OuterVolumeSpecName: "scripts") pod "39272589-06ba-4924-9f8d-a72bc499ba6f" (UID: "39272589-06ba-4924-9f8d-a72bc499ba6f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.265442 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.265474 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39272589-06ba-4924-9f8d-a72bc499ba6f-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.265487 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39272589-06ba-4924-9f8d-a72bc499ba6f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.266718 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-5qdgx"] Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.269484 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39272589-06ba-4924-9f8d-a72bc499ba6f-kube-api-access-qml9s" (OuterVolumeSpecName: "kube-api-access-qml9s") pod "39272589-06ba-4924-9f8d-a72bc499ba6f" (UID: "39272589-06ba-4924-9f8d-a72bc499ba6f"). InnerVolumeSpecName "kube-api-access-qml9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.270089 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39272589-06ba-4924-9f8d-a72bc499ba6f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "39272589-06ba-4924-9f8d-a72bc499ba6f" (UID: "39272589-06ba-4924-9f8d-a72bc499ba6f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.320304 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.367550 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qml9s\" (UniqueName: \"kubernetes.io/projected/39272589-06ba-4924-9f8d-a72bc499ba6f-kube-api-access-qml9s\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.367587 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39272589-06ba-4924-9f8d-a72bc499ba6f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.414063 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q29ng"] Oct 07 19:18:46 crc kubenswrapper[4825]: W1007 19:18:46.424101 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d9e561b_95fb_4643_8452_01f9ae3475eb.slice/crio-ce84fde6ad9bf08730193ea5e06fa45b5d4c2abfe6627591159ec96c77bfdebd WatchSource:0}: Error finding container ce84fde6ad9bf08730193ea5e06fa45b5d4c2abfe6627591159ec96c77bfdebd: Status 404 returned error can't find the container with id ce84fde6ad9bf08730193ea5e06fa45b5d4c2abfe6627591159ec96c77bfdebd Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.637114 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tnkcb" event={"ID":"3dbca3ac-3960-4572-93c4-04276137f96a","Type":"ContainerStarted","Data":"9303478f08a4b2c0ee54d55314c59e8480daf1fe22eea067e520f7bdce9d2beb"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.639116 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q29ng" 
event={"ID":"2d9e561b-95fb-4643-8452-01f9ae3475eb","Type":"ContainerStarted","Data":"b3983c1456971ed1e8946e43305af7caa014da109e1daed3fd8bc3d89db94d6b"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.639224 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q29ng" event={"ID":"2d9e561b-95fb-4643-8452-01f9ae3475eb","Type":"ContainerStarted","Data":"ce84fde6ad9bf08730193ea5e06fa45b5d4c2abfe6627591159ec96c77bfdebd"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.641755 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d7dd5b56-nhlgz" event={"ID":"710a139f-bf12-4021-b702-3e40d49febf1","Type":"ContainerStarted","Data":"8d40dc613170c3af99124b22be7985638d99a3bf0f3d45c4e7e8707c9c377ad5"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.643926 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ec02c95-3434-4e6a-b6a9-713f0ef83dba","Type":"ContainerStarted","Data":"68f9934cc1f2bf3fb8cee1a2377d0548432ede6ff22fd9fdad0cfa55cda831f3"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.646214 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566c6c8d88-h74t9" event={"ID":"b66fe3a9-9849-4219-badb-a0cecbb2a388","Type":"ContainerStarted","Data":"19a90f1f23ed8a8f03b71fa604a7882b2bb5e3ffd46fc3e08e208306b33d96f5"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.648088 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8hldz" event={"ID":"1abc8e94-8f1f-4195-b476-248206d004bf","Type":"ContainerStarted","Data":"f8e9a2c8b3a4847f51327cedb19b1b2435b747f56dfcd88d3b2654273e64326d"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.649581 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4c0c4637-00a5-4fd1-8105-4b36770b00b6","Type":"ContainerStarted","Data":"e5263e54a5f92be2cf6b45fb506a9e2086143a54d6ecffbf7ad73793b430cafe"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.659985 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tnkcb" podStartSLOduration=2.844295766 podStartE2EDuration="39.659968487s" podCreationTimestamp="2025-10-07 19:18:07 +0000 UTC" firstStartedPulling="2025-10-07 19:18:08.777323234 +0000 UTC m=+1077.599361871" lastFinishedPulling="2025-10-07 19:18:45.592995955 +0000 UTC m=+1114.415034592" observedRunningTime="2025-10-07 19:18:46.654297047 +0000 UTC m=+1115.476335704" watchObservedRunningTime="2025-10-07 19:18:46.659968487 +0000 UTC m=+1115.482007124" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.663153 4825 generic.go:334] "Generic (PLEG): container finished" podID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerID="97c2f3ebed0c01295a9c8b616040c0b0432436d19cbaa6a6f706a195f3ea38b1" exitCode=0 Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.663302 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" event={"ID":"5fddc514-0ec2-4022-9971-75c8dd44ef6c","Type":"ContainerDied","Data":"97c2f3ebed0c01295a9c8b616040c0b0432436d19cbaa6a6f706a195f3ea38b1"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.663323 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" event={"ID":"5fddc514-0ec2-4022-9971-75c8dd44ef6c","Type":"ContainerStarted","Data":"7b5cfc253f6dbf1a78aafd87ad9eb87f1c51a98495a436946410d69532a950d6"} Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.666765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7577885557-tllnd" event={"ID":"39272589-06ba-4924-9f8d-a72bc499ba6f","Type":"ContainerDied","Data":"e2d3764bb8b937a36a0d4284bc81b08c89d163d2876024b89f74bd7ab575debc"} Oct 07 19:18:46 crc 
kubenswrapper[4825]: I1007 19:18:46.666822 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7577885557-tllnd" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.690542 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q29ng" podStartSLOduration=21.690524517 podStartE2EDuration="21.690524517s" podCreationTimestamp="2025-10-07 19:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:46.66982846 +0000 UTC m=+1115.491867097" watchObservedRunningTime="2025-10-07 19:18:46.690524517 +0000 UTC m=+1115.512563144" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.693004 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8hldz" podStartSLOduration=3.213677502 podStartE2EDuration="39.692997116s" podCreationTimestamp="2025-10-07 19:18:07 +0000 UTC" firstStartedPulling="2025-10-07 19:18:09.116625565 +0000 UTC m=+1077.938664202" lastFinishedPulling="2025-10-07 19:18:45.595945179 +0000 UTC m=+1114.417983816" observedRunningTime="2025-10-07 19:18:46.682593805 +0000 UTC m=+1115.504632442" watchObservedRunningTime="2025-10-07 19:18:46.692997116 +0000 UTC m=+1115.515035753" Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.791513 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7577885557-tllnd"] Oct 07 19:18:46 crc kubenswrapper[4825]: I1007 19:18:46.798488 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7577885557-tllnd"] Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.681336 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c0c4637-00a5-4fd1-8105-4b36770b00b6","Type":"ContainerStarted","Data":"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5"} Oct 07 
19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.685148 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" event={"ID":"5fddc514-0ec2-4022-9971-75c8dd44ef6c","Type":"ContainerStarted","Data":"100e248de1d6a802045068c3fbc0b33e4733b21ab1314fb1e65bb9023e48c456"} Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.685242 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.688369 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d7dd5b56-nhlgz" event={"ID":"710a139f-bf12-4021-b702-3e40d49febf1","Type":"ContainerStarted","Data":"290e3b9616f1ba8b9c8cd7f6bdb27d38f6956a76fb5bd9d0b7d12ce9bbcf7800"} Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.688412 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58d7dd5b56-nhlgz" event={"ID":"710a139f-bf12-4021-b702-3e40d49febf1","Type":"ContainerStarted","Data":"cc3497038523c4c8ef0a462cc8104367cfd8a48ebeafc3144ed11bcc7210e377"} Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.695603 4825 generic.go:334] "Generic (PLEG): container finished" podID="019abaa0-c821-4f8c-a195-a9ea7bc81f8b" containerID="5468c16d9290fd9e1ab84b25c93a6102553ae19cdd636995b2fe7c94302ed5b1" exitCode=0 Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.695681 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cccsz" event={"ID":"019abaa0-c821-4f8c-a195-a9ea7bc81f8b","Type":"ContainerDied","Data":"5468c16d9290fd9e1ab84b25c93a6102553ae19cdd636995b2fe7c94302ed5b1"} Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.698334 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ec02c95-3434-4e6a-b6a9-713f0ef83dba","Type":"ContainerStarted","Data":"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124"} Oct 07 19:18:47 
crc kubenswrapper[4825]: I1007 19:18:47.698406 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-log" containerID="cri-o://c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124" gracePeriod=30 Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.698423 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-httpd" containerID="cri-o://89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013" gracePeriod=30 Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.707375 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerStarted","Data":"ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f"} Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.710539 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" podStartSLOduration=35.710527567 podStartE2EDuration="35.710527567s" podCreationTimestamp="2025-10-07 19:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:47.707906734 +0000 UTC m=+1116.529945421" watchObservedRunningTime="2025-10-07 19:18:47.710527567 +0000 UTC m=+1116.532566194" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.717479 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566c6c8d88-h74t9" event={"ID":"b66fe3a9-9849-4219-badb-a0cecbb2a388","Type":"ContainerStarted","Data":"b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20"} Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.717519 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-566c6c8d88-h74t9" event={"ID":"b66fe3a9-9849-4219-badb-a0cecbb2a388","Type":"ContainerStarted","Data":"aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840"} Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.740960 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58d7dd5b56-nhlgz" podStartSLOduration=26.866415343 podStartE2EDuration="27.740946003s" podCreationTimestamp="2025-10-07 19:18:20 +0000 UTC" firstStartedPulling="2025-10-07 19:18:46.273103152 +0000 UTC m=+1115.095141789" lastFinishedPulling="2025-10-07 19:18:47.147633812 +0000 UTC m=+1115.969672449" observedRunningTime="2025-10-07 19:18:47.737019889 +0000 UTC m=+1116.559058526" watchObservedRunningTime="2025-10-07 19:18:47.740946003 +0000 UTC m=+1116.562984650" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.846635 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11b2ace9-8b4b-4b81-ac2a-602ad2860c67" path="/var/lib/kubelet/pods/11b2ace9-8b4b-4b81-ac2a-602ad2860c67/volumes" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.847092 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39272589-06ba-4924-9f8d-a72bc499ba6f" path="/var/lib/kubelet/pods/39272589-06ba-4924-9f8d-a72bc499ba6f/volumes" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.850530 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" path="/var/lib/kubelet/pods/89497435-87e0-4c79-9372-3ad2ae5c3161/volumes" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.851177 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d75938-430b-4a74-87aa-50e8acdd63e0" path="/var/lib/kubelet/pods/b8d75938-430b-4a74-87aa-50e8acdd63e0/volumes" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.872208 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-566c6c8d88-h74t9" 
podStartSLOduration=26.984899554 podStartE2EDuration="27.872193431s" podCreationTimestamp="2025-10-07 19:18:20 +0000 UTC" firstStartedPulling="2025-10-07 19:18:46.261877615 +0000 UTC m=+1115.083916252" lastFinishedPulling="2025-10-07 19:18:47.149171472 +0000 UTC m=+1115.971210129" observedRunningTime="2025-10-07 19:18:47.871896052 +0000 UTC m=+1116.693934729" watchObservedRunningTime="2025-10-07 19:18:47.872193431 +0000 UTC m=+1116.694232068" Oct 07 19:18:47 crc kubenswrapper[4825]: I1007 19:18:47.884460 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=35.88443813 podStartE2EDuration="35.88443813s" podCreationTimestamp="2025-10-07 19:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:47.820144588 +0000 UTC m=+1116.642183225" watchObservedRunningTime="2025-10-07 19:18:47.88443813 +0000 UTC m=+1116.706476767" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.306964 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.406779 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.406843 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-combined-ca-bundle\") pod \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.406872 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-httpd-run\") pod \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.406945 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-scripts\") pod \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.407050 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-config-data\") pod \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.407097 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z565j\" (UniqueName: 
\"kubernetes.io/projected/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-kube-api-access-z565j\") pod \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.407119 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-logs\") pod \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\" (UID: \"8ec02c95-3434-4e6a-b6a9-713f0ef83dba\") " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.407753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-logs" (OuterVolumeSpecName: "logs") pod "8ec02c95-3434-4e6a-b6a9-713f0ef83dba" (UID: "8ec02c95-3434-4e6a-b6a9-713f0ef83dba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.409115 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ec02c95-3434-4e6a-b6a9-713f0ef83dba" (UID: "8ec02c95-3434-4e6a-b6a9-713f0ef83dba"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.410929 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-qwgl2" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.413856 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "8ec02c95-3434-4e6a-b6a9-713f0ef83dba" (UID: "8ec02c95-3434-4e6a-b6a9-713f0ef83dba"). 
InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.413909 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-kube-api-access-z565j" (OuterVolumeSpecName: "kube-api-access-z565j") pod "8ec02c95-3434-4e6a-b6a9-713f0ef83dba" (UID: "8ec02c95-3434-4e6a-b6a9-713f0ef83dba"). InnerVolumeSpecName "kube-api-access-z565j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.414040 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-scripts" (OuterVolumeSpecName: "scripts") pod "8ec02c95-3434-4e6a-b6a9-713f0ef83dba" (UID: "8ec02c95-3434-4e6a-b6a9-713f0ef83dba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.436086 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ec02c95-3434-4e6a-b6a9-713f0ef83dba" (UID: "8ec02c95-3434-4e6a-b6a9-713f0ef83dba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.456451 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-config-data" (OuterVolumeSpecName: "config-data") pod "8ec02c95-3434-4e6a-b6a9-713f0ef83dba" (UID: "8ec02c95-3434-4e6a-b6a9-713f0ef83dba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.515913 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.516221 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.516351 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z565j\" (UniqueName: \"kubernetes.io/projected/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-kube-api-access-z565j\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.516449 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.516691 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.516860 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.516982 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec02c95-3434-4e6a-b6a9-713f0ef83dba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.536332 4825 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.619951 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.730629 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c0c4637-00a5-4fd1-8105-4b36770b00b6","Type":"ContainerStarted","Data":"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474"} Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.730820 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-log" containerID="cri-o://9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5" gracePeriod=30 Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.731013 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-httpd" containerID="cri-o://c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474" gracePeriod=30 Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.733999 4825 generic.go:334] "Generic (PLEG): container finished" podID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerID="89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013" exitCode=143 Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.734027 4825 generic.go:334] "Generic (PLEG): container finished" podID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerID="c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124" exitCode=143 Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.734065 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ec02c95-3434-4e6a-b6a9-713f0ef83dba","Type":"ContainerDied","Data":"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013"} Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.734092 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ec02c95-3434-4e6a-b6a9-713f0ef83dba","Type":"ContainerDied","Data":"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124"} Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.734103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ec02c95-3434-4e6a-b6a9-713f0ef83dba","Type":"ContainerDied","Data":"68f9934cc1f2bf3fb8cee1a2377d0548432ede6ff22fd9fdad0cfa55cda831f3"} Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.734118 4825 scope.go:117] "RemoveContainer" containerID="89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.734479 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.738338 4825 generic.go:334] "Generic (PLEG): container finished" podID="1abc8e94-8f1f-4195-b476-248206d004bf" containerID="f8e9a2c8b3a4847f51327cedb19b1b2435b747f56dfcd88d3b2654273e64326d" exitCode=0 Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.738570 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8hldz" event={"ID":"1abc8e94-8f1f-4195-b476-248206d004bf","Type":"ContainerDied","Data":"f8e9a2c8b3a4847f51327cedb19b1b2435b747f56dfcd88d3b2654273e64326d"} Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.761709 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=36.761688447 podStartE2EDuration="36.761688447s" podCreationTimestamp="2025-10-07 19:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:48.753066753 +0000 UTC m=+1117.575105420" watchObservedRunningTime="2025-10-07 19:18:48.761688447 +0000 UTC m=+1117.583727084" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.782088 4825 scope.go:117] "RemoveContainer" containerID="c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.814059 4825 scope.go:117] "RemoveContainer" containerID="89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013" Oct 07 19:18:48 crc kubenswrapper[4825]: E1007 19:18:48.814809 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013\": container with ID starting with 89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013 not found: ID does not exist" 
containerID="89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.814869 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013"} err="failed to get container status \"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013\": rpc error: code = NotFound desc = could not find container \"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013\": container with ID starting with 89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013 not found: ID does not exist" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.814894 4825 scope.go:117] "RemoveContainer" containerID="c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124" Oct 07 19:18:48 crc kubenswrapper[4825]: E1007 19:18:48.815257 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124\": container with ID starting with c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124 not found: ID does not exist" containerID="c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.815298 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124"} err="failed to get container status \"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124\": rpc error: code = NotFound desc = could not find container \"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124\": container with ID starting with c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124 not found: ID does not exist" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.815325 4825 scope.go:117] 
"RemoveContainer" containerID="89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.815599 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013"} err="failed to get container status \"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013\": rpc error: code = NotFound desc = could not find container \"89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013\": container with ID starting with 89c7c9af772ad2e4421d8dbdefe3cac544149ae63151c98f63b86f84f9178013 not found: ID does not exist" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.815645 4825 scope.go:117] "RemoveContainer" containerID="c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.816296 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124"} err="failed to get container status \"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124\": rpc error: code = NotFound desc = could not find container \"c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124\": container with ID starting with c58ab5ceae598ade1461e5a6b9d7a169835896448e2f08f75b0bfdce2bc4c124 not found: ID does not exist" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.818210 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.848450 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.871349 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:48 crc kubenswrapper[4825]: 
E1007 19:18:48.871777 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.871789 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" Oct 07 19:18:48 crc kubenswrapper[4825]: E1007 19:18:48.871810 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-log" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.871817 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-log" Oct 07 19:18:48 crc kubenswrapper[4825]: E1007 19:18:48.871831 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-httpd" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.871837 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-httpd" Oct 07 19:18:48 crc kubenswrapper[4825]: E1007 19:18:48.871860 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="init" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.871866 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="init" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.872035 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-httpd" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.872064 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" containerName="glance-log" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.872076 4825 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="89497435-87e0-4c79-9372-3ad2ae5c3161" containerName="dnsmasq-dns" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.873745 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.876253 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.877577 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 19:18:48 crc kubenswrapper[4825]: I1007 19:18:48.886209 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027606 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-scripts\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027647 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 
19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027678 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-config-data\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027800 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027921 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027963 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njdxs\" (UniqueName: \"kubernetes.io/projected/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-kube-api-access-njdxs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.027996 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-logs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " 
pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.130347 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.130429 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njdxs\" (UniqueName: \"kubernetes.io/projected/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-kube-api-access-njdxs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.130473 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-logs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.130534 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.130554 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-scripts\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc 
kubenswrapper[4825]: I1007 19:18:49.130576 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.130618 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-config-data\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.130659 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.133076 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.133688 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-logs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.133927 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.140533 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-config-data\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.144679 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.144688 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-scripts\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.145302 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.151096 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njdxs\" (UniqueName: 
\"kubernetes.io/projected/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-kube-api-access-njdxs\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.155838 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.162568 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.313242 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.332360 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjfwv\" (UniqueName: \"kubernetes.io/projected/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-kube-api-access-bjfwv\") pod \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.332406 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-combined-ca-bundle\") pod \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.332430 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config\") pod \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\" (UID: 
\"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.340359 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-kube-api-access-bjfwv" (OuterVolumeSpecName: "kube-api-access-bjfwv") pod "019abaa0-c821-4f8c-a195-a9ea7bc81f8b" (UID: "019abaa0-c821-4f8c-a195-a9ea7bc81f8b"). InnerVolumeSpecName "kube-api-access-bjfwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: E1007 19:18:49.356576 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config podName:019abaa0-c821-4f8c-a195-a9ea7bc81f8b nodeName:}" failed. No retries permitted until 2025-10-07 19:18:49.856500005 +0000 UTC m=+1118.678538652 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config") pod "019abaa0-c821-4f8c-a195-a9ea7bc81f8b" (UID: "019abaa0-c821-4f8c-a195-a9ea7bc81f8b") : error deleting /var/lib/kubelet/pods/019abaa0-c821-4f8c-a195-a9ea7bc81f8b/volume-subpaths: remove /var/lib/kubelet/pods/019abaa0-c821-4f8c-a195-a9ea7bc81f8b/volume-subpaths: no such file or directory Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.362318 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "019abaa0-c821-4f8c-a195-a9ea7bc81f8b" (UID: "019abaa0-c821-4f8c-a195-a9ea7bc81f8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: E1007 19:18:49.372407 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dbca3ac_3960_4572_93c4_04276137f96a.slice/crio-conmon-9303478f08a4b2c0ee54d55314c59e8480daf1fe22eea067e520f7bdce9d2beb.scope\": RecentStats: unable to find data in memory cache]" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.377314 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.436044 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjfwv\" (UniqueName: \"kubernetes.io/projected/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-kube-api-access-bjfwv\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.436326 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.537536 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-scripts\") pod \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.537571 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-combined-ca-bundle\") pod \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.537616 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.537651 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmccg\" (UniqueName: \"kubernetes.io/projected/4c0c4637-00a5-4fd1-8105-4b36770b00b6-kube-api-access-jmccg\") pod \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.537718 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-config-data\") pod \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.537734 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-httpd-run\") pod \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.537774 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-logs\") pod \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\" (UID: \"4c0c4637-00a5-4fd1-8105-4b36770b00b6\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.538493 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-logs" (OuterVolumeSpecName: "logs") pod "4c0c4637-00a5-4fd1-8105-4b36770b00b6" (UID: "4c0c4637-00a5-4fd1-8105-4b36770b00b6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.540069 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4c0c4637-00a5-4fd1-8105-4b36770b00b6" (UID: "4c0c4637-00a5-4fd1-8105-4b36770b00b6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.543316 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "4c0c4637-00a5-4fd1-8105-4b36770b00b6" (UID: "4c0c4637-00a5-4fd1-8105-4b36770b00b6"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.544434 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0c4637-00a5-4fd1-8105-4b36770b00b6-kube-api-access-jmccg" (OuterVolumeSpecName: "kube-api-access-jmccg") pod "4c0c4637-00a5-4fd1-8105-4b36770b00b6" (UID: "4c0c4637-00a5-4fd1-8105-4b36770b00b6"). InnerVolumeSpecName "kube-api-access-jmccg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.549391 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-scripts" (OuterVolumeSpecName: "scripts") pod "4c0c4637-00a5-4fd1-8105-4b36770b00b6" (UID: "4c0c4637-00a5-4fd1-8105-4b36770b00b6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.573873 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c0c4637-00a5-4fd1-8105-4b36770b00b6" (UID: "4c0c4637-00a5-4fd1-8105-4b36770b00b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.598546 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-config-data" (OuterVolumeSpecName: "config-data") pod "4c0c4637-00a5-4fd1-8105-4b36770b00b6" (UID: "4c0c4637-00a5-4fd1-8105-4b36770b00b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.639530 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.639561 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.639585 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.639595 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmccg\" (UniqueName: \"kubernetes.io/projected/4c0c4637-00a5-4fd1-8105-4b36770b00b6-kube-api-access-jmccg\") on node \"crc\" DevicePath \"\"" Oct 07 
19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.639604 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c4637-00a5-4fd1-8105-4b36770b00b6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.639612 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.639619 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0c4637-00a5-4fd1-8105-4b36770b00b6-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.657033 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.740888 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.763081 4825 generic.go:334] "Generic (PLEG): container finished" podID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerID="c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474" exitCode=0 Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.763113 4825 generic.go:334] "Generic (PLEG): container finished" podID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerID="9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5" exitCode=143 Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.763375 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4c0c4637-00a5-4fd1-8105-4b36770b00b6","Type":"ContainerDied","Data":"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474"} Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.763410 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c0c4637-00a5-4fd1-8105-4b36770b00b6","Type":"ContainerDied","Data":"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5"} Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.763439 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c0c4637-00a5-4fd1-8105-4b36770b00b6","Type":"ContainerDied","Data":"e5263e54a5f92be2cf6b45fb506a9e2086143a54d6ecffbf7ad73793b430cafe"} Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.763457 4825 scope.go:117] "RemoveContainer" containerID="c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.763557 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.769290 4825 generic.go:334] "Generic (PLEG): container finished" podID="3dbca3ac-3960-4572-93c4-04276137f96a" containerID="9303478f08a4b2c0ee54d55314c59e8480daf1fe22eea067e520f7bdce9d2beb" exitCode=0 Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.769358 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tnkcb" event={"ID":"3dbca3ac-3960-4572-93c4-04276137f96a","Type":"ContainerDied","Data":"9303478f08a4b2c0ee54d55314c59e8480daf1fe22eea067e520f7bdce9d2beb"} Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.771242 4825 generic.go:334] "Generic (PLEG): container finished" podID="2d9e561b-95fb-4643-8452-01f9ae3475eb" containerID="b3983c1456971ed1e8946e43305af7caa014da109e1daed3fd8bc3d89db94d6b" exitCode=0 Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.771320 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q29ng" event={"ID":"2d9e561b-95fb-4643-8452-01f9ae3475eb","Type":"ContainerDied","Data":"b3983c1456971ed1e8946e43305af7caa014da109e1daed3fd8bc3d89db94d6b"} Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.783069 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cccsz" event={"ID":"019abaa0-c821-4f8c-a195-a9ea7bc81f8b","Type":"ContainerDied","Data":"bc1c44a04c1f57f3399d033b2349de96a60236c0a7729c90f6b8290d19baf628"} Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.783119 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1c44a04c1f57f3399d033b2349de96a60236c0a7729c90f6b8290d19baf628" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.783211 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cccsz" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.814490 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec02c95-3434-4e6a-b6a9-713f0ef83dba" path="/var/lib/kubelet/pods/8ec02c95-3434-4e6a-b6a9-713f0ef83dba/volumes" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.867583 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.881636 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.889784 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.895728 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:49 crc kubenswrapper[4825]: E1007 19:18:49.896205 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-log" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.896245 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-log" Oct 07 19:18:49 crc kubenswrapper[4825]: E1007 19:18:49.896277 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-httpd" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.896287 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-httpd" Oct 07 19:18:49 crc kubenswrapper[4825]: E1007 19:18:49.896307 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019abaa0-c821-4f8c-a195-a9ea7bc81f8b" containerName="neutron-db-sync" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 
19:18:49.896318 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="019abaa0-c821-4f8c-a195-a9ea7bc81f8b" containerName="neutron-db-sync" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.896518 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-httpd" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.896552 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="019abaa0-c821-4f8c-a195-a9ea7bc81f8b" containerName="neutron-db-sync" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.896570 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" containerName="glance-log" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.897913 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.906474 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.906718 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.941480 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.946184 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config\") pod \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\" (UID: \"019abaa0-c821-4f8c-a195-a9ea7bc81f8b\") " Oct 07 19:18:49 crc kubenswrapper[4825]: I1007 19:18:49.959814 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config" 
(OuterVolumeSpecName: "config") pod "019abaa0-c821-4f8c-a195-a9ea7bc81f8b" (UID: "019abaa0-c821-4f8c-a195-a9ea7bc81f8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.051788 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.051853 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlnlq\" (UniqueName: \"kubernetes.io/projected/20980f0a-4148-4983-8991-ea4563cfbc5a-kube-api-access-qlnlq\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.051876 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.051945 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.051967 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.052009 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.052034 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.052052 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.052106 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/019abaa0-c821-4f8c-a195-a9ea7bc81f8b-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.063851 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-5qdgx"] Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.064316 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerName="dnsmasq-dns" containerID="cri-o://100e248de1d6a802045068c3fbc0b33e4733b21ab1314fb1e65bb9023e48c456" gracePeriod=10 Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.085979 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rgbxc"] Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.087645 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.115315 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rgbxc"] Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.127353 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d67bd544-4s2q8"] Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.128903 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.134818 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.134907 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.134819 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-69wjq" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.134828 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.144783 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d67bd544-4s2q8"] Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.153614 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mqlb\" (UniqueName: \"kubernetes.io/projected/aac266a4-2c85-420b-b54c-7e9527761052-kube-api-access-9mqlb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.153817 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.153904 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.153958 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.153992 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154135 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154206 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlnlq\" (UniqueName: \"kubernetes.io/projected/20980f0a-4148-4983-8991-ea4563cfbc5a-kube-api-access-qlnlq\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154273 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154388 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154458 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154490 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-config\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154514 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.154550 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.155100 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.155599 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.156083 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.168502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.168732 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.170156 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.175056 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlnlq\" (UniqueName: 
\"kubernetes.io/projected/20980f0a-4148-4983-8991-ea4563cfbc5a-kube-api-access-qlnlq\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.181885 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.190520 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256204 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mqlb\" (UniqueName: \"kubernetes.io/projected/aac266a4-2c85-420b-b54c-7e9527761052-kube-api-access-9mqlb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256277 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-config\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256303 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974lp\" (UniqueName: 
\"kubernetes.io/projected/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-kube-api-access-974lp\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256336 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-ovndb-tls-certs\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256363 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256423 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-combined-ca-bundle\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256439 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-httpd-config\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256526 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256569 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.256587 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-config\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.257605 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-config\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.258494 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-svc\") pod 
\"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.258990 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.259485 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.260035 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.297310 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mqlb\" (UniqueName: \"kubernetes.io/projected/aac266a4-2c85-420b-b54c-7e9527761052-kube-api-access-9mqlb\") pod \"dnsmasq-dns-84b966f6c9-rgbxc\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.297735 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.358423 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-config\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.358488 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-ovndb-tls-certs\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.358509 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974lp\" (UniqueName: \"kubernetes.io/projected/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-kube-api-access-974lp\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.358571 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-combined-ca-bundle\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.358590 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-httpd-config\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 
19:18:50.363984 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-httpd-config\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.364088 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-combined-ca-bundle\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.364648 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-ovndb-tls-certs\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.368147 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-config\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.378193 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974lp\" (UniqueName: \"kubernetes.io/projected/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-kube-api-access-974lp\") pod \"neutron-d67bd544-4s2q8\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.408950 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:50 crc 
kubenswrapper[4825]: I1007 19:18:50.410099 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.435599 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.480060 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.480119 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.631275 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.832327 4825 generic.go:334] "Generic (PLEG): container finished" podID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerID="100e248de1d6a802045068c3fbc0b33e4733b21ab1314fb1e65bb9023e48c456" exitCode=0 Oct 07 19:18:50 crc kubenswrapper[4825]: I1007 19:18:50.832522 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" event={"ID":"5fddc514-0ec2-4022-9971-75c8dd44ef6c","Type":"ContainerDied","Data":"100e248de1d6a802045068c3fbc0b33e4733b21ab1314fb1e65bb9023e48c456"} Oct 07 19:18:51 crc kubenswrapper[4825]: I1007 19:18:51.807354 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0c4637-00a5-4fd1-8105-4b36770b00b6" path="/var/lib/kubelet/pods/4c0c4637-00a5-4fd1-8105-4b36770b00b6/volumes" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.362376 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d47b47d5-hc6q5"] Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.363682 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.367461 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.367598 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.387084 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d47b47d5-hc6q5"] Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.508667 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-internal-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.508732 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-combined-ca-bundle\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.508788 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-config\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.508869 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvv8h\" (UniqueName: 
\"kubernetes.io/projected/e07eddca-def8-4a86-8d72-0c916ba6b6c1-kube-api-access-wvv8h\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.508908 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-httpd-config\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.508924 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-ovndb-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.509015 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-public-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.611172 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-combined-ca-bundle\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.611259 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-config\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.611312 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvv8h\" (UniqueName: \"kubernetes.io/projected/e07eddca-def8-4a86-8d72-0c916ba6b6c1-kube-api-access-wvv8h\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.611336 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-ovndb-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.611353 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-httpd-config\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.611416 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-public-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.611452 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-internal-tls-certs\") pod 
\"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.617695 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-public-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.620116 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-httpd-config\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.620810 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-ovndb-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.621929 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-config\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.625310 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-internal-tls-certs\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 
19:18:52.633252 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07eddca-def8-4a86-8d72-0c916ba6b6c1-combined-ca-bundle\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.635679 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvv8h\" (UniqueName: \"kubernetes.io/projected/e07eddca-def8-4a86-8d72-0c916ba6b6c1-kube-api-access-wvv8h\") pod \"neutron-7d47b47d5-hc6q5\" (UID: \"e07eddca-def8-4a86-8d72-0c916ba6b6c1\") " pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:52 crc kubenswrapper[4825]: I1007 19:18:52.686028 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:53 crc kubenswrapper[4825]: I1007 19:18:53.005436 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused" Oct 07 19:18:53 crc kubenswrapper[4825]: W1007 19:18:53.969697 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e1cdf5_2de6_438e_b0f1_b05a7c1a2779.slice/crio-c6955314d90e40469bf65a41f9d689ae831f8d71c765ac7f2f50190d048e3891 WatchSource:0}: Error finding container c6955314d90e40469bf65a41f9d689ae831f8d71c765ac7f2f50190d048e3891: Status 404 returned error can't find the container with id c6955314d90e40469bf65a41f9d689ae831f8d71c765ac7f2f50190d048e3891 Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.012843 4825 scope.go:117] "RemoveContainer" containerID="9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 
19:18:54.065938 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.111226 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.140684 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-scripts\") pod \"1abc8e94-8f1f-4195-b476-248206d004bf\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.140758 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1abc8e94-8f1f-4195-b476-248206d004bf-logs\") pod \"1abc8e94-8f1f-4195-b476-248206d004bf\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.140799 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-config-data\") pod \"1abc8e94-8f1f-4195-b476-248206d004bf\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.140975 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv6s7\" (UniqueName: \"kubernetes.io/projected/1abc8e94-8f1f-4195-b476-248206d004bf-kube-api-access-zv6s7\") pod \"1abc8e94-8f1f-4195-b476-248206d004bf\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.141000 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-combined-ca-bundle\") pod 
\"1abc8e94-8f1f-4195-b476-248206d004bf\" (UID: \"1abc8e94-8f1f-4195-b476-248206d004bf\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.144631 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abc8e94-8f1f-4195-b476-248206d004bf-logs" (OuterVolumeSpecName: "logs") pod "1abc8e94-8f1f-4195-b476-248206d004bf" (UID: "1abc8e94-8f1f-4195-b476-248206d004bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.153380 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-scripts" (OuterVolumeSpecName: "scripts") pod "1abc8e94-8f1f-4195-b476-248206d004bf" (UID: "1abc8e94-8f1f-4195-b476-248206d004bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.153544 4825 scope.go:117] "RemoveContainer" containerID="c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474" Oct 07 19:18:54 crc kubenswrapper[4825]: E1007 19:18:54.155907 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474\": container with ID starting with c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474 not found: ID does not exist" containerID="c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.155948 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474"} err="failed to get container status \"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474\": rpc error: code = NotFound desc = could not find container 
\"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474\": container with ID starting with c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474 not found: ID does not exist" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.155992 4825 scope.go:117] "RemoveContainer" containerID="9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5" Oct 07 19:18:54 crc kubenswrapper[4825]: E1007 19:18:54.156522 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5\": container with ID starting with 9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5 not found: ID does not exist" containerID="9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.156570 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5"} err="failed to get container status \"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5\": rpc error: code = NotFound desc = could not find container \"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5\": container with ID starting with 9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5 not found: ID does not exist" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.156587 4825 scope.go:117] "RemoveContainer" containerID="c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.158896 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474"} err="failed to get container status \"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474\": rpc error: code = NotFound desc = could not find 
container \"c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474\": container with ID starting with c050b799dcd9529ec2cb1d88f2feafbcfb97f0f2a3bc760020f67f1099160474 not found: ID does not exist" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.158916 4825 scope.go:117] "RemoveContainer" containerID="9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.160060 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5"} err="failed to get container status \"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5\": rpc error: code = NotFound desc = could not find container \"9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5\": container with ID starting with 9159cc4f1218a348b0fddfb9b8831d64e88457b45ce2b88ebeb5d91be51840a5 not found: ID does not exist" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.169670 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abc8e94-8f1f-4195-b476-248206d004bf-kube-api-access-zv6s7" (OuterVolumeSpecName: "kube-api-access-zv6s7") pod "1abc8e94-8f1f-4195-b476-248206d004bf" (UID: "1abc8e94-8f1f-4195-b476-248206d004bf"). InnerVolumeSpecName "kube-api-access-zv6s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.199744 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1abc8e94-8f1f-4195-b476-248206d004bf" (UID: "1abc8e94-8f1f-4195-b476-248206d004bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.221562 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-config-data" (OuterVolumeSpecName: "config-data") pod "1abc8e94-8f1f-4195-b476-248206d004bf" (UID: "1abc8e94-8f1f-4195-b476-248206d004bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.223799 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.242032 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-db-sync-config-data\") pod \"3dbca3ac-3960-4572-93c4-04276137f96a\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.242118 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4ds8\" (UniqueName: \"kubernetes.io/projected/3dbca3ac-3960-4572-93c4-04276137f96a-kube-api-access-v4ds8\") pod \"3dbca3ac-3960-4572-93c4-04276137f96a\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.242914 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-combined-ca-bundle\") pod \"3dbca3ac-3960-4572-93c4-04276137f96a\" (UID: \"3dbca3ac-3960-4572-93c4-04276137f96a\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.245887 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv6s7\" (UniqueName: 
\"kubernetes.io/projected/1abc8e94-8f1f-4195-b476-248206d004bf-kube-api-access-zv6s7\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.245914 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.245925 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.245935 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1abc8e94-8f1f-4195-b476-248206d004bf-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.245962 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abc8e94-8f1f-4195-b476-248206d004bf-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.247174 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3dbca3ac-3960-4572-93c4-04276137f96a" (UID: "3dbca3ac-3960-4572-93c4-04276137f96a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.247814 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbca3ac-3960-4572-93c4-04276137f96a-kube-api-access-v4ds8" (OuterVolumeSpecName: "kube-api-access-v4ds8") pod "3dbca3ac-3960-4572-93c4-04276137f96a" (UID: "3dbca3ac-3960-4572-93c4-04276137f96a"). 
InnerVolumeSpecName "kube-api-access-v4ds8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.274923 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dbca3ac-3960-4572-93c4-04276137f96a" (UID: "3dbca3ac-3960-4572-93c4-04276137f96a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.346633 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mftgv\" (UniqueName: \"kubernetes.io/projected/2d9e561b-95fb-4643-8452-01f9ae3475eb-kube-api-access-mftgv\") pod \"2d9e561b-95fb-4643-8452-01f9ae3475eb\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.347268 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-config-data\") pod \"2d9e561b-95fb-4643-8452-01f9ae3475eb\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.347401 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-credential-keys\") pod \"2d9e561b-95fb-4643-8452-01f9ae3475eb\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.347444 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-combined-ca-bundle\") pod \"2d9e561b-95fb-4643-8452-01f9ae3475eb\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " Oct 07 19:18:54 crc 
kubenswrapper[4825]: I1007 19:18:54.347466 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-scripts\") pod \"2d9e561b-95fb-4643-8452-01f9ae3475eb\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.347689 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-fernet-keys\") pod \"2d9e561b-95fb-4643-8452-01f9ae3475eb\" (UID: \"2d9e561b-95fb-4643-8452-01f9ae3475eb\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.348035 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.348052 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3dbca3ac-3960-4572-93c4-04276137f96a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.348063 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4ds8\" (UniqueName: \"kubernetes.io/projected/3dbca3ac-3960-4572-93c4-04276137f96a-kube-api-access-v4ds8\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.351132 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2d9e561b-95fb-4643-8452-01f9ae3475eb" (UID: "2d9e561b-95fb-4643-8452-01f9ae3475eb"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.351160 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2d9e561b-95fb-4643-8452-01f9ae3475eb" (UID: "2d9e561b-95fb-4643-8452-01f9ae3475eb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.353360 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9e561b-95fb-4643-8452-01f9ae3475eb-kube-api-access-mftgv" (OuterVolumeSpecName: "kube-api-access-mftgv") pod "2d9e561b-95fb-4643-8452-01f9ae3475eb" (UID: "2d9e561b-95fb-4643-8452-01f9ae3475eb"). InnerVolumeSpecName "kube-api-access-mftgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.354256 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-scripts" (OuterVolumeSpecName: "scripts") pod "2d9e561b-95fb-4643-8452-01f9ae3475eb" (UID: "2d9e561b-95fb-4643-8452-01f9ae3475eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.359678 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.388404 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d9e561b-95fb-4643-8452-01f9ae3475eb" (UID: "2d9e561b-95fb-4643-8452-01f9ae3475eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.392815 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-config-data" (OuterVolumeSpecName: "config-data") pod "2d9e561b-95fb-4643-8452-01f9ae3475eb" (UID: "2d9e561b-95fb-4643-8452-01f9ae3475eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.448985 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-swift-storage-0\") pod \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449053 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-sb\") pod \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449080 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-nb\") pod \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449153 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn9l2\" (UniqueName: \"kubernetes.io/projected/5fddc514-0ec2-4022-9971-75c8dd44ef6c-kube-api-access-gn9l2\") pod \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449250 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-config\") pod \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449283 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-svc\") pod \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\" (UID: \"5fddc514-0ec2-4022-9971-75c8dd44ef6c\") " Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449613 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449626 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mftgv\" (UniqueName: \"kubernetes.io/projected/2d9e561b-95fb-4643-8452-01f9ae3475eb-kube-api-access-mftgv\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449635 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449644 4825 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.449654 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 
19:18:54.449662 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9e561b-95fb-4643-8452-01f9ae3475eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.467346 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fddc514-0ec2-4022-9971-75c8dd44ef6c-kube-api-access-gn9l2" (OuterVolumeSpecName: "kube-api-access-gn9l2") pod "5fddc514-0ec2-4022-9971-75c8dd44ef6c" (UID: "5fddc514-0ec2-4022-9971-75c8dd44ef6c"). InnerVolumeSpecName "kube-api-access-gn9l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.487648 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5fddc514-0ec2-4022-9971-75c8dd44ef6c" (UID: "5fddc514-0ec2-4022-9971-75c8dd44ef6c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.492753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5fddc514-0ec2-4022-9971-75c8dd44ef6c" (UID: "5fddc514-0ec2-4022-9971-75c8dd44ef6c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.493308 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5fddc514-0ec2-4022-9971-75c8dd44ef6c" (UID: "5fddc514-0ec2-4022-9971-75c8dd44ef6c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.500806 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-config" (OuterVolumeSpecName: "config") pod "5fddc514-0ec2-4022-9971-75c8dd44ef6c" (UID: "5fddc514-0ec2-4022-9971-75c8dd44ef6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.505062 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5fddc514-0ec2-4022-9971-75c8dd44ef6c" (UID: "5fddc514-0ec2-4022-9971-75c8dd44ef6c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.550755 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.550784 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.550796 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.550805 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 
crc kubenswrapper[4825]: I1007 19:18:54.550813 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn9l2\" (UniqueName: \"kubernetes.io/projected/5fddc514-0ec2-4022-9971-75c8dd44ef6c-kube-api-access-gn9l2\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.550822 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddc514-0ec2-4022-9971-75c8dd44ef6c-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.678278 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rgbxc"] Oct 07 19:18:54 crc kubenswrapper[4825]: W1007 19:18:54.683710 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaac266a4_2c85_420b_b54c_7e9527761052.slice/crio-aa2011be69d952e3f2bee8ff60b0351b3b042d8496b50ec0dae17f9b1914c066 WatchSource:0}: Error finding container aa2011be69d952e3f2bee8ff60b0351b3b042d8496b50ec0dae17f9b1914c066: Status 404 returned error can't find the container with id aa2011be69d952e3f2bee8ff60b0351b3b042d8496b50ec0dae17f9b1914c066 Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.740437 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d47b47d5-hc6q5"] Oct 07 19:18:54 crc kubenswrapper[4825]: W1007 19:18:54.772249 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode07eddca_def8_4a86_8d72_0c916ba6b6c1.slice/crio-fd692d481681481ff60b2ce774ed3210d56a3854908019a8f5a690b0c9740d11 WatchSource:0}: Error finding container fd692d481681481ff60b2ce774ed3210d56a3854908019a8f5a690b0c9740d11: Status 404 returned error can't find the container with id fd692d481681481ff60b2ce774ed3210d56a3854908019a8f5a690b0c9740d11 Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.866099 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d47b47d5-hc6q5" event={"ID":"e07eddca-def8-4a86-8d72-0c916ba6b6c1","Type":"ContainerStarted","Data":"fd692d481681481ff60b2ce774ed3210d56a3854908019a8f5a690b0c9740d11"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.868027 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8hldz" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.868019 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8hldz" event={"ID":"1abc8e94-8f1f-4195-b476-248206d004bf","Type":"ContainerDied","Data":"ee434fdd336abb19037a78d6d8e7ab2939811a07512343010a3ee83448f2338e"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.868166 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee434fdd336abb19037a78d6d8e7ab2939811a07512343010a3ee83448f2338e" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.873612 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.874586 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tnkcb" event={"ID":"3dbca3ac-3960-4572-93c4-04276137f96a","Type":"ContainerDied","Data":"1b59aad5bc2b0e4b14c44b2fd8bd5bfac17ddd4ef1a28ac60cef9a12bcc0478c"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.874615 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b59aad5bc2b0e4b14c44b2fd8bd5bfac17ddd4ef1a28ac60cef9a12bcc0478c" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.874659 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tnkcb" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.879671 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779","Type":"ContainerStarted","Data":"c6955314d90e40469bf65a41f9d689ae831f8d71c765ac7f2f50190d048e3891"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.881646 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" event={"ID":"aac266a4-2c85-420b-b54c-7e9527761052","Type":"ContainerStarted","Data":"aa2011be69d952e3f2bee8ff60b0351b3b042d8496b50ec0dae17f9b1914c066"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.898210 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerStarted","Data":"112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.899781 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q29ng" event={"ID":"2d9e561b-95fb-4643-8452-01f9ae3475eb","Type":"ContainerDied","Data":"ce84fde6ad9bf08730193ea5e06fa45b5d4c2abfe6627591159ec96c77bfdebd"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.899801 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce84fde6ad9bf08730193ea5e06fa45b5d4c2abfe6627591159ec96c77bfdebd" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.899848 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q29ng" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.904747 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" event={"ID":"5fddc514-0ec2-4022-9971-75c8dd44ef6c","Type":"ContainerDied","Data":"7b5cfc253f6dbf1a78aafd87ad9eb87f1c51a98495a436946410d69532a950d6"} Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.904806 4825 scope.go:117] "RemoveContainer" containerID="100e248de1d6a802045068c3fbc0b33e4733b21ab1314fb1e65bb9023e48c456" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.904818 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-5qdgx" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.945462 4825 scope.go:117] "RemoveContainer" containerID="97c2f3ebed0c01295a9c8b616040c0b0432436d19cbaa6a6f706a195f3ea38b1" Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.954327 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-5qdgx"] Oct 07 19:18:54 crc kubenswrapper[4825]: I1007 19:18:54.967206 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-5qdgx"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.186102 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f4b8c987b-kjdd8"] Oct 07 19:18:55 crc kubenswrapper[4825]: E1007 19:18:55.186726 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9e561b-95fb-4643-8452-01f9ae3475eb" containerName="keystone-bootstrap" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.186741 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9e561b-95fb-4643-8452-01f9ae3475eb" containerName="keystone-bootstrap" Oct 07 19:18:55 crc kubenswrapper[4825]: E1007 19:18:55.186765 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" 
containerName="dnsmasq-dns" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.186773 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerName="dnsmasq-dns" Oct 07 19:18:55 crc kubenswrapper[4825]: E1007 19:18:55.186800 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abc8e94-8f1f-4195-b476-248206d004bf" containerName="placement-db-sync" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.186808 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abc8e94-8f1f-4195-b476-248206d004bf" containerName="placement-db-sync" Oct 07 19:18:55 crc kubenswrapper[4825]: E1007 19:18:55.186822 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbca3ac-3960-4572-93c4-04276137f96a" containerName="barbican-db-sync" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.186830 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbca3ac-3960-4572-93c4-04276137f96a" containerName="barbican-db-sync" Oct 07 19:18:55 crc kubenswrapper[4825]: E1007 19:18:55.186844 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerName="init" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.186850 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerName="init" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.187124 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9e561b-95fb-4643-8452-01f9ae3475eb" containerName="keystone-bootstrap" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.187148 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" containerName="dnsmasq-dns" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.187163 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abc8e94-8f1f-4195-b476-248206d004bf" 
containerName="placement-db-sync" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.187178 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbca3ac-3960-4572-93c4-04276137f96a" containerName="barbican-db-sync" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.189418 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.310192 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f4b8c987b-kjdd8"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.321120 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.321865 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.321907 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.321950 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.321983 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2qjzz" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.443778 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznzb\" (UniqueName: \"kubernetes.io/projected/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-kube-api-access-sznzb\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.443850 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-internal-tls-certs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.443894 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-config-data\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.443914 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-public-tls-certs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.443952 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-logs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.443991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-combined-ca-bundle\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.444031 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-scripts\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.482349 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6b848888b7-8bpk8"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.483544 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.506507 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.506694 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w9rfd" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.506911 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.507094 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.507199 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.507319 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.512933 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b848888b7-8bpk8"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.545515 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-scripts\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.545574 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sznzb\" (UniqueName: \"kubernetes.io/projected/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-kube-api-access-sznzb\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.545614 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-internal-tls-certs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.545655 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-config-data\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.545670 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-public-tls-certs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.545700 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-logs\") pod 
\"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.545734 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-combined-ca-bundle\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.554751 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-logs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.557854 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-combined-ca-bundle\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.565016 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-config-data\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.566767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-internal-tls-certs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " 
pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.572761 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-scripts\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.574755 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-public-tls-certs\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.588996 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznzb\" (UniqueName: \"kubernetes.io/projected/4297247b-64e7-4379-aa35-9e2bf6d2d5d5-kube-api-access-sznzb\") pod \"placement-7f4b8c987b-kjdd8\" (UID: \"4297247b-64e7-4379-aa35-9e2bf6d2d5d5\") " pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.614293 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-664844745f-dvzxg"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.630907 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.633330 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n59v5" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.638404 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.638630 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.640140 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-594d76bc86-m9c6m"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.642033 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.647728 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.648187 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.652802 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-664844745f-dvzxg"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.653609 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-config-data\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.653646 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m82b\" (UniqueName: \"kubernetes.io/projected/a0cf82d8-414d-4486-9cef-be5b38e75745-kube-api-access-5m82b\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.653695 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-combined-ca-bundle\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.653726 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-scripts\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.653756 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-credential-keys\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.654623 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-public-tls-certs\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.654651 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-internal-tls-certs\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.654673 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-fernet-keys\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.668119 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-594d76bc86-m9c6m"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.720081 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rgbxc"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757175 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-config-data-custom\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757257 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-scripts\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757294 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-credential-keys\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757320 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-combined-ca-bundle\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757349 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-combined-ca-bundle\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757367 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-config-data\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757394 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6z7m\" (UniqueName: \"kubernetes.io/projected/ea2e502f-f902-43be-989a-2f0ed4e3ae02-kube-api-access-j6z7m\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757420 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-config-data\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757462 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69znt\" (UniqueName: \"kubernetes.io/projected/56d9e279-5942-4a24-84db-5d7f8fcabcba-kube-api-access-69znt\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-public-tls-certs\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: 
I1007 19:18:55.757637 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-internal-tls-certs\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757697 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-fernet-keys\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757725 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56d9e279-5942-4a24-84db-5d7f8fcabcba-logs\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757779 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-config-data\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m82b\" (UniqueName: \"kubernetes.io/projected/a0cf82d8-414d-4486-9cef-be5b38e75745-kube-api-access-5m82b\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.757898 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-config-data-custom\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.758004 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea2e502f-f902-43be-989a-2f0ed4e3ae02-logs\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.758030 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-combined-ca-bundle\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.764511 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-public-tls-certs\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.766965 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d67bd544-4s2q8"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.774427 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-combined-ca-bundle\") pod \"keystone-6b848888b7-8bpk8\" (UID: 
\"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.775422 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-credential-keys\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.775589 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-scripts\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.783918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-config-data\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.785571 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xfgfj"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.793728 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-fernet-keys\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.798030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0cf82d8-414d-4486-9cef-be5b38e75745-internal-tls-certs\") pod \"keystone-6b848888b7-8bpk8\" 
(UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.812023 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m82b\" (UniqueName: \"kubernetes.io/projected/a0cf82d8-414d-4486-9cef-be5b38e75745-kube-api-access-5m82b\") pod \"keystone-6b848888b7-8bpk8\" (UID: \"a0cf82d8-414d-4486-9cef-be5b38e75745\") " pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.812418 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.824977 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fddc514-0ec2-4022-9971-75c8dd44ef6c" path="/var/lib/kubelet/pods/5fddc514-0ec2-4022-9971-75c8dd44ef6c/volumes" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.825668 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-578479f95d-ht4w8"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.828247 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-578479f95d-ht4w8"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.828345 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.831932 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.834331 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xfgfj"] Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.844771 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866346 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69znt\" (UniqueName: \"kubernetes.io/projected/56d9e279-5942-4a24-84db-5d7f8fcabcba-kube-api-access-69znt\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866412 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56d9e279-5942-4a24-84db-5d7f8fcabcba-logs\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866456 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-config-data-custom\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866482 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea2e502f-f902-43be-989a-2f0ed4e3ae02-logs\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866508 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-config-data-custom\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: 
\"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866552 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-combined-ca-bundle\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866576 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-combined-ca-bundle\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866591 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-config-data\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866611 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6z7m\" (UniqueName: \"kubernetes.io/projected/ea2e502f-f902-43be-989a-2f0ed4e3ae02-kube-api-access-j6z7m\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.866628 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-config-data\") 
pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.874338 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56d9e279-5942-4a24-84db-5d7f8fcabcba-logs\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.877652 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-combined-ca-bundle\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.878428 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea2e502f-f902-43be-989a-2f0ed4e3ae02-logs\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.881847 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-config-data-custom\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.882447 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-config-data\") pod 
\"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.882802 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d9e279-5942-4a24-84db-5d7f8fcabcba-config-data\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.899564 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69znt\" (UniqueName: \"kubernetes.io/projected/56d9e279-5942-4a24-84db-5d7f8fcabcba-kube-api-access-69znt\") pod \"barbican-worker-664844745f-dvzxg\" (UID: \"56d9e279-5942-4a24-84db-5d7f8fcabcba\") " pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.914629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6z7m\" (UniqueName: \"kubernetes.io/projected/ea2e502f-f902-43be-989a-2f0ed4e3ae02-kube-api-access-j6z7m\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.917735 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-combined-ca-bundle\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.946560 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"20980f0a-4148-4983-8991-ea4563cfbc5a","Type":"ContainerStarted","Data":"d07a43b2fd4e4ac4aa3f7ae6ea7ab2106945c03180672d8d8bd130188e5beed7"} Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974574 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-combined-ca-bundle\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974709 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974742 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmb6c\" (UniqueName: \"kubernetes.io/projected/6db17e9a-ec51-4c59-ad0d-0835b90c8231-kube-api-access-tmb6c\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974775 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db17e9a-ec51-4c59-ad0d-0835b90c8231-logs\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974797 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974824 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data-custom\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974861 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-config\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974884 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974906 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.974949 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2wzxf\" (UniqueName: \"kubernetes.io/projected/ce0b553c-e9d5-4613-a202-65fc185f60b4-kube-api-access-2wzxf\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.975039 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.979992 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-664844745f-dvzxg" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.996558 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea2e502f-f902-43be-989a-2f0ed4e3ae02-config-data-custom\") pod \"barbican-keystone-listener-594d76bc86-m9c6m\" (UID: \"ea2e502f-f902-43be-989a-2f0ed4e3ae02\") " pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.996963 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" Oct 07 19:18:55 crc kubenswrapper[4825]: I1007 19:18:55.999462 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779","Type":"ContainerStarted","Data":"26500b8f142625bd8a7d7547a151da17091c720ad58d1b05f0226cafe735907c"} Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.003580 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d67bd544-4s2q8" event={"ID":"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e","Type":"ContainerStarted","Data":"82c3dd9b7a5795f4cb1ff430c8d976cb273cc75df29b1e31c7c9da0f13f3c9cc"} Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.007155 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d47b47d5-hc6q5" event={"ID":"e07eddca-def8-4a86-8d72-0c916ba6b6c1","Type":"ContainerStarted","Data":"31fff61b843158b33407840f1c8da5a717724520ea0d6b6153aab2d8ad243b5f"} Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.007206 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d47b47d5-hc6q5" event={"ID":"e07eddca-def8-4a86-8d72-0c916ba6b6c1","Type":"ContainerStarted","Data":"ab83cf1b98b23b3770d95435dd4b86ccff8832336891461b6e70745780f45d90"} Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.007278 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.057596 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d47b47d5-hc6q5" podStartSLOduration=4.057575558 podStartE2EDuration="4.057575558s" podCreationTimestamp="2025-10-07 19:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:56.054216282 +0000 UTC m=+1124.876254919" 
watchObservedRunningTime="2025-10-07 19:18:56.057575558 +0000 UTC m=+1124.879614185" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.064407 4825 generic.go:334] "Generic (PLEG): container finished" podID="aac266a4-2c85-420b-b54c-7e9527761052" containerID="44805a0ca7fc90a7633de2a67370356c7ce7d5627f34d33fddf8783a678c0c4e" exitCode=0 Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.064460 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" event={"ID":"aac266a4-2c85-420b-b54c-7e9527761052","Type":"ContainerDied","Data":"44805a0ca7fc90a7633de2a67370356c7ce7d5627f34d33fddf8783a678c0c4e"} Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077635 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077699 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-combined-ca-bundle\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077753 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077784 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tmb6c\" (UniqueName: \"kubernetes.io/projected/6db17e9a-ec51-4c59-ad0d-0835b90c8231-kube-api-access-tmb6c\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077802 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db17e9a-ec51-4c59-ad0d-0835b90c8231-logs\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data-custom\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077869 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-config\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077890 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077907 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.077927 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wzxf\" (UniqueName: \"kubernetes.io/projected/ce0b553c-e9d5-4613-a202-65fc185f60b4-kube-api-access-2wzxf\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.079010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.080437 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.080959 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.081307 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db17e9a-ec51-4c59-ad0d-0835b90c8231-logs\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.081536 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-config\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.081946 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.107115 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmb6c\" (UniqueName: \"kubernetes.io/projected/6db17e9a-ec51-4c59-ad0d-0835b90c8231-kube-api-access-tmb6c\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.124097 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data-custom\") pod 
\"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.124111 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wzxf\" (UniqueName: \"kubernetes.io/projected/ce0b553c-e9d5-4613-a202-65fc185f60b4-kube-api-access-2wzxf\") pod \"dnsmasq-dns-75c8ddd69c-xfgfj\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.124122 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-combined-ca-bundle\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.125170 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data\") pod \"barbican-api-578479f95d-ht4w8\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.152698 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.192572 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.396829 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f4b8c987b-kjdd8"] Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.557341 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b848888b7-8bpk8"] Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.764522 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-664844745f-dvzxg"] Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.895991 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-594d76bc86-m9c6m"] Oct 07 19:18:56 crc kubenswrapper[4825]: I1007 19:18:56.935951 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-578479f95d-ht4w8"] Oct 07 19:18:57 crc kubenswrapper[4825]: I1007 19:18:57.039056 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xfgfj"] Oct 07 19:18:57 crc kubenswrapper[4825]: I1007 19:18:57.077106 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f4b8c987b-kjdd8" event={"ID":"4297247b-64e7-4379-aa35-9e2bf6d2d5d5","Type":"ContainerStarted","Data":"d4362dfec85148c7aa186cc094a62e563493050228ea245fc543028f671e8a8e"} Oct 07 19:18:57 crc kubenswrapper[4825]: I1007 19:18:57.079793 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"20980f0a-4148-4983-8991-ea4563cfbc5a","Type":"ContainerStarted","Data":"fbacb8c9a04b1fa2222ab53c98163452b1e70381c82894595819fc1118db2b62"} Oct 07 19:18:57 crc kubenswrapper[4825]: W1007 19:18:57.277360 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d9e279_5942_4a24_84db_5d7f8fcabcba.slice/crio-0ec60f88df5449c5db112899ad63d9996659768fbe87f74b291ec748896f9322 WatchSource:0}: Error finding container 0ec60f88df5449c5db112899ad63d9996659768fbe87f74b291ec748896f9322: Status 404 returned error can't find the container with id 0ec60f88df5449c5db112899ad63d9996659768fbe87f74b291ec748896f9322 Oct 07 19:18:57 crc kubenswrapper[4825]: W1007 19:18:57.281142 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea2e502f_f902_43be_989a_2f0ed4e3ae02.slice/crio-69420b6bc5967e8a9c94a09744ab8a77c61b36398ba206f7f0389a4e1bc6a426 WatchSource:0}: Error finding container 69420b6bc5967e8a9c94a09744ab8a77c61b36398ba206f7f0389a4e1bc6a426: Status 404 returned error can't find the container with id 69420b6bc5967e8a9c94a09744ab8a77c61b36398ba206f7f0389a4e1bc6a426 Oct 07 19:18:57 crc kubenswrapper[4825]: W1007 19:18:57.285031 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6db17e9a_ec51_4c59_ad0d_0835b90c8231.slice/crio-c77739ccf3876013f8877d9d5d3a501349ffcfb5fc6c5b55ab84b235aa541c03 WatchSource:0}: Error finding container c77739ccf3876013f8877d9d5d3a501349ffcfb5fc6c5b55ab84b235aa541c03: Status 404 returned error can't find the container with id c77739ccf3876013f8877d9d5d3a501349ffcfb5fc6c5b55ab84b235aa541c03 Oct 07 19:18:57 crc kubenswrapper[4825]: E1007 19:18:57.308414 4825 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 07 19:18:57 crc kubenswrapper[4825]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/aac266a4-2c85-420b-b54c-7e9527761052/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 19:18:57 crc kubenswrapper[4825]: > 
podSandboxID="aa2011be69d952e3f2bee8ff60b0351b3b042d8496b50ec0dae17f9b1914c066" Oct 07 19:18:57 crc kubenswrapper[4825]: E1007 19:18:57.308881 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 07 19:18:57 crc kubenswrapper[4825]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n574hdbh5ddhffh5c4h5dbh5bfh8bh75h584h97h557h69h55fhbch66dh94h5fdh9fh56fh584h5bbhb5h75hcbh667h77h577h5bhf9hf4hb4q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Vol
umeMount{Name:kube-api-access-9mqlb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84b966f6c9-rgbxc_openstack(aac266a4-2c85-420b-b54c-7e9527761052): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/aac266a4-2c85-420b-b54c-7e9527761052/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 19:18:57 crc kubenswrapper[4825]: > logger="UnhandledError" Oct 07 19:18:57 crc kubenswrapper[4825]: E1007 19:18:57.310306 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/aac266a4-2c85-420b-b54c-7e9527761052/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" podUID="aac266a4-2c85-420b-b54c-7e9527761052" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.124134 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578479f95d-ht4w8" event={"ID":"6db17e9a-ec51-4c59-ad0d-0835b90c8231","Type":"ContainerStarted","Data":"348a031751fa888be540da3478681d04b11c301a45d453dc05e08fa79bd63773"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.124428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578479f95d-ht4w8" event={"ID":"6db17e9a-ec51-4c59-ad0d-0835b90c8231","Type":"ContainerStarted","Data":"c77739ccf3876013f8877d9d5d3a501349ffcfb5fc6c5b55ab84b235aa541c03"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.126081 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-664844745f-dvzxg" event={"ID":"56d9e279-5942-4a24-84db-5d7f8fcabcba","Type":"ContainerStarted","Data":"0ec60f88df5449c5db112899ad63d9996659768fbe87f74b291ec748896f9322"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.132525 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" event={"ID":"ea2e502f-f902-43be-989a-2f0ed4e3ae02","Type":"ContainerStarted","Data":"69420b6bc5967e8a9c94a09744ab8a77c61b36398ba206f7f0389a4e1bc6a426"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.138408 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"20980f0a-4148-4983-8991-ea4563cfbc5a","Type":"ContainerStarted","Data":"4ba4f194e908f0920dc5ec7deffa30535c2f100a35d0955b8887349efca9d864"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.150434 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779","Type":"ContainerStarted","Data":"a4559148f6852c76e778a887208ad48edafe31301d7b04b04ef0e1b0ab844138"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.156563 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f4b8c987b-kjdd8" event={"ID":"4297247b-64e7-4379-aa35-9e2bf6d2d5d5","Type":"ContainerStarted","Data":"244a449573a9a234cf22d09cbb748d656666273d8bc6494dfd2aa838c497a857"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.160377 4825 generic.go:334] "Generic (PLEG): container finished" podID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerID="9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db" exitCode=0 Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.160441 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" event={"ID":"ce0b553c-e9d5-4613-a202-65fc185f60b4","Type":"ContainerDied","Data":"9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.160465 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" event={"ID":"ce0b553c-e9d5-4613-a202-65fc185f60b4","Type":"ContainerStarted","Data":"2d54a3abebec3d9990f0bf446b54246d618db3ba595eaefa9f0b27ee078b5fd0"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.166670 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.166649592 podStartE2EDuration="9.166649592s" podCreationTimestamp="2025-10-07 19:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:58.156395876 +0000 UTC m=+1126.978434513" watchObservedRunningTime="2025-10-07 19:18:58.166649592 +0000 UTC m=+1126.988688229" Oct 07 19:18:58 crc 
kubenswrapper[4825]: I1007 19:18:58.170449 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d67bd544-4s2q8" event={"ID":"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e","Type":"ContainerStarted","Data":"430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.172296 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b848888b7-8bpk8" event={"ID":"a0cf82d8-414d-4486-9cef-be5b38e75745","Type":"ContainerStarted","Data":"70a51bc17d6bfbabb48af6ec865211356f1f710ab79b810c3b7a1926379ff5e8"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.172343 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b848888b7-8bpk8" event={"ID":"a0cf82d8-414d-4486-9cef-be5b38e75745","Type":"ContainerStarted","Data":"e73531bcabc5d4b28cdcfd9d7c41743de34ac7016050b921d429f03644c92ffb"} Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.172646 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.181852 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.181837194 podStartE2EDuration="10.181837194s" podCreationTimestamp="2025-10-07 19:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:58.181467032 +0000 UTC m=+1127.003505669" watchObservedRunningTime="2025-10-07 19:18:58.181837194 +0000 UTC m=+1127.003875821" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.285923 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6b848888b7-8bpk8" podStartSLOduration=3.285902859 podStartE2EDuration="3.285902859s" podCreationTimestamp="2025-10-07 19:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:58.265077697 +0000 UTC m=+1127.087116334" watchObservedRunningTime="2025-10-07 19:18:58.285902859 +0000 UTC m=+1127.107941496" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.553217 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.632338 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mqlb\" (UniqueName: \"kubernetes.io/projected/aac266a4-2c85-420b-b54c-7e9527761052-kube-api-access-9mqlb\") pod \"aac266a4-2c85-420b-b54c-7e9527761052\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.632409 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-nb\") pod \"aac266a4-2c85-420b-b54c-7e9527761052\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.632528 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-config\") pod \"aac266a4-2c85-420b-b54c-7e9527761052\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.632552 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-swift-storage-0\") pod \"aac266a4-2c85-420b-b54c-7e9527761052\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.632567 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-svc\") pod \"aac266a4-2c85-420b-b54c-7e9527761052\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.632587 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-sb\") pod \"aac266a4-2c85-420b-b54c-7e9527761052\" (UID: \"aac266a4-2c85-420b-b54c-7e9527761052\") " Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.650887 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac266a4-2c85-420b-b54c-7e9527761052-kube-api-access-9mqlb" (OuterVolumeSpecName: "kube-api-access-9mqlb") pod "aac266a4-2c85-420b-b54c-7e9527761052" (UID: "aac266a4-2c85-420b-b54c-7e9527761052"). InnerVolumeSpecName "kube-api-access-9mqlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.706616 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aac266a4-2c85-420b-b54c-7e9527761052" (UID: "aac266a4-2c85-420b-b54c-7e9527761052"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.720667 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aac266a4-2c85-420b-b54c-7e9527761052" (UID: "aac266a4-2c85-420b-b54c-7e9527761052"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.736555 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.736586 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.736598 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mqlb\" (UniqueName: \"kubernetes.io/projected/aac266a4-2c85-420b-b54c-7e9527761052-kube-api-access-9mqlb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.736968 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aac266a4-2c85-420b-b54c-7e9527761052" (UID: "aac266a4-2c85-420b-b54c-7e9527761052"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.739636 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aac266a4-2c85-420b-b54c-7e9527761052" (UID: "aac266a4-2c85-420b-b54c-7e9527761052"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.748779 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-config" (OuterVolumeSpecName: "config") pod "aac266a4-2c85-420b-b54c-7e9527761052" (UID: "aac266a4-2c85-420b-b54c-7e9527761052"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.838763 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.838797 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.838811 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aac266a4-2c85-420b-b54c-7e9527761052-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.945982 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-657f997574-lnlbm"] Oct 07 19:18:58 crc kubenswrapper[4825]: E1007 19:18:58.946364 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac266a4-2c85-420b-b54c-7e9527761052" containerName="init" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.946379 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac266a4-2c85-420b-b54c-7e9527761052" containerName="init" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.948056 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac266a4-2c85-420b-b54c-7e9527761052" containerName="init" Oct 07 19:18:58 crc 
kubenswrapper[4825]: I1007 19:18:58.949068 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.951176 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 07 19:18:58 crc kubenswrapper[4825]: I1007 19:18:58.951468 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:58.960327 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-657f997574-lnlbm"] Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.041353 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-config-data\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.041512 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzzwl\" (UniqueName: \"kubernetes.io/projected/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-kube-api-access-fzzwl\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.041544 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-logs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.041581 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-combined-ca-bundle\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.041660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-public-tls-certs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.041695 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-internal-tls-certs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.041727 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-config-data-custom\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.144055 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzzwl\" (UniqueName: \"kubernetes.io/projected/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-kube-api-access-fzzwl\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc 
kubenswrapper[4825]: I1007 19:18:59.144372 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-logs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.144408 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-combined-ca-bundle\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.144474 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-public-tls-certs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.144498 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-internal-tls-certs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.144516 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-config-data-custom\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.144545 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-config-data\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.144971 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-logs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.149525 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-combined-ca-bundle\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.152490 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-config-data-custom\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.155480 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-config-data\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.155796 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-public-tls-certs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.156191 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-internal-tls-certs\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.166854 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzzwl\" (UniqueName: \"kubernetes.io/projected/dc023b5f-d12b-4ce6-9cc6-1bac1fa48455-kube-api-access-fzzwl\") pod \"barbican-api-657f997574-lnlbm\" (UID: \"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455\") " pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.193412 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" event={"ID":"ce0b553c-e9d5-4613-a202-65fc185f60b4","Type":"ContainerStarted","Data":"cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317"} Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.193488 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.207137 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d67bd544-4s2q8" event={"ID":"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e","Type":"ContainerStarted","Data":"09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b"} Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.208113 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:18:59 crc 
kubenswrapper[4825]: I1007 19:18:59.214521 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578479f95d-ht4w8" event={"ID":"6db17e9a-ec51-4c59-ad0d-0835b90c8231","Type":"ContainerStarted","Data":"6cc2d532cfcc18c8cb9531818e4d3f565130eb88bb11eebcd3d82a23f350d9bf"} Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.214834 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" podStartSLOduration=4.214813076 podStartE2EDuration="4.214813076s" podCreationTimestamp="2025-10-07 19:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:59.212779372 +0000 UTC m=+1128.034818009" watchObservedRunningTime="2025-10-07 19:18:59.214813076 +0000 UTC m=+1128.036851713" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.215377 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.215406 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.220007 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" event={"ID":"aac266a4-2c85-420b-b54c-7e9527761052","Type":"ContainerDied","Data":"aa2011be69d952e3f2bee8ff60b0351b3b042d8496b50ec0dae17f9b1914c066"} Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.220058 4825 scope.go:117] "RemoveContainer" containerID="44805a0ca7fc90a7633de2a67370356c7ce7d5627f34d33fddf8783a678c0c4e" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.220055 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rgbxc" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.231256 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d67bd544-4s2q8" podStartSLOduration=9.231217357 podStartE2EDuration="9.231217357s" podCreationTimestamp="2025-10-07 19:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:59.230921588 +0000 UTC m=+1128.052960225" watchObservedRunningTime="2025-10-07 19:18:59.231217357 +0000 UTC m=+1128.053255994" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.239299 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f4b8c987b-kjdd8" event={"ID":"4297247b-64e7-4379-aa35-9e2bf6d2d5d5","Type":"ContainerStarted","Data":"57da2b357883c5454055f316f05a2615f4eff2ef61d402810fc44ac1429f416f"} Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.239537 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.249932 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.273444 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-578479f95d-ht4w8" podStartSLOduration=4.273424948 podStartE2EDuration="4.273424948s" podCreationTimestamp="2025-10-07 19:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:59.255776287 +0000 UTC m=+1128.077814924" watchObservedRunningTime="2025-10-07 19:18:59.273424948 +0000 UTC m=+1128.095463605" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.286790 4825 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/placement-7f4b8c987b-kjdd8" podStartSLOduration=4.286772471 podStartE2EDuration="4.286772471s" podCreationTimestamp="2025-10-07 19:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:18:59.277869848 +0000 UTC m=+1128.099908495" watchObservedRunningTime="2025-10-07 19:18:59.286772471 +0000 UTC m=+1128.108811108" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.298456 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.316024 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.316071 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.329080 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rgbxc"] Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.335471 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rgbxc"] Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.388567 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.389151 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 19:18:59 crc kubenswrapper[4825]: I1007 19:18:59.808973 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac266a4-2c85-420b-b54c-7e9527761052" path="/var/lib/kubelet/pods/aac266a4-2c85-420b-b54c-7e9527761052/volumes" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 
19:19:00.278463 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vlkds" event={"ID":"ba465067-0e79-4d52-bc56-a4b60767eb7d","Type":"ContainerStarted","Data":"e2453075a44cf7eb404e3853642aeab5fe6820e2fbc4aed966e4b20a5ca29edd"} Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.279919 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.279939 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.297887 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vlkds" podStartSLOduration=4.315835406 podStartE2EDuration="53.297869669s" podCreationTimestamp="2025-10-07 19:18:07 +0000 UTC" firstStartedPulling="2025-10-07 19:18:09.058066491 +0000 UTC m=+1077.880105128" lastFinishedPulling="2025-10-07 19:18:58.040100754 +0000 UTC m=+1126.862139391" observedRunningTime="2025-10-07 19:19:00.296668881 +0000 UTC m=+1129.118707508" watchObservedRunningTime="2025-10-07 19:19:00.297869669 +0000 UTC m=+1129.119908306" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.298875 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.299805 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.371213 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.377709 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:00 crc 
kubenswrapper[4825]: I1007 19:19:00.416800 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-566c6c8d88-h74t9" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.482012 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-58d7dd5b56-nhlgz" podUID="710a139f-bf12-4021-b702-3e40d49febf1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Oct 07 19:19:00 crc kubenswrapper[4825]: I1007 19:19:00.552552 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-657f997574-lnlbm"] Oct 07 19:19:00 crc kubenswrapper[4825]: W1007 19:19:00.577987 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc023b5f_d12b_4ce6_9cc6_1bac1fa48455.slice/crio-c4cde1dff1e7c037eb7a0d2341a0da3445cdd48b1389b810d9df987e9a93e643 WatchSource:0}: Error finding container c4cde1dff1e7c037eb7a0d2341a0da3445cdd48b1389b810d9df987e9a93e643: Status 404 returned error can't find the container with id c4cde1dff1e7c037eb7a0d2341a0da3445cdd48b1389b810d9df987e9a93e643 Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.294910 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-664844745f-dvzxg" event={"ID":"56d9e279-5942-4a24-84db-5d7f8fcabcba","Type":"ContainerStarted","Data":"e95297c1aded37e6857da8fd5c417a1eb1fefb421d00c6034644cb7fe650910f"} Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.295289 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-664844745f-dvzxg" 
event={"ID":"56d9e279-5942-4a24-84db-5d7f8fcabcba","Type":"ContainerStarted","Data":"a3ee3f9ccde965424707bb95b5c066e1517ea3b0f9b1f824eecfbaa41d3ecff4"} Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.306140 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" event={"ID":"ea2e502f-f902-43be-989a-2f0ed4e3ae02","Type":"ContainerStarted","Data":"ec92157c2494ec69ad2639e99bac6f24eae9b8e0cc82156c0a3c393494d6b612"} Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.306182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" event={"ID":"ea2e502f-f902-43be-989a-2f0ed4e3ae02","Type":"ContainerStarted","Data":"31f3c711e9d35d6cdedabc17af0225fcd4c743b2c662f5b2ea32a4111f750e55"} Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.319961 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657f997574-lnlbm" event={"ID":"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455","Type":"ContainerStarted","Data":"f8eacc68352958b4319fd2054eca117cf6d2fbf8806af6534cb2c42c98378fde"} Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.320006 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.320017 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657f997574-lnlbm" event={"ID":"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455","Type":"ContainerStarted","Data":"e13808f148f224b6e85f46848f9eae304b393854c7cb54713d04b1327c676216"} Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.320030 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657f997574-lnlbm" event={"ID":"dc023b5f-d12b-4ce6-9cc6-1bac1fa48455","Type":"ContainerStarted","Data":"c4cde1dff1e7c037eb7a0d2341a0da3445cdd48b1389b810d9df987e9a93e643"} Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 
19:19:01.322336 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.322360 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.322370 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.328897 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-664844745f-dvzxg" podStartSLOduration=3.521983385 podStartE2EDuration="6.328881589s" podCreationTimestamp="2025-10-07 19:18:55 +0000 UTC" firstStartedPulling="2025-10-07 19:18:57.279217391 +0000 UTC m=+1126.101256028" lastFinishedPulling="2025-10-07 19:19:00.086115595 +0000 UTC m=+1128.908154232" observedRunningTime="2025-10-07 19:19:01.325553393 +0000 UTC m=+1130.147592060" watchObservedRunningTime="2025-10-07 19:19:01.328881589 +0000 UTC m=+1130.150920226" Oct 07 19:19:01 crc kubenswrapper[4825]: I1007 19:19:01.353405 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-657f997574-lnlbm" podStartSLOduration=3.3533889869999998 podStartE2EDuration="3.353388987s" podCreationTimestamp="2025-10-07 19:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:19:01.351172217 +0000 UTC m=+1130.173210854" watchObservedRunningTime="2025-10-07 19:19:01.353388987 +0000 UTC m=+1130.175427624" Oct 07 19:19:02 crc kubenswrapper[4825]: I1007 19:19:02.333069 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 19:19:02 crc kubenswrapper[4825]: I1007 19:19:02.589306 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Oct 07 19:19:02 crc kubenswrapper[4825]: I1007 19:19:02.618478 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-594d76bc86-m9c6m" podStartSLOduration=4.8161377210000005 podStartE2EDuration="7.618458129s" podCreationTimestamp="2025-10-07 19:18:55 +0000 UTC" firstStartedPulling="2025-10-07 19:18:57.284067226 +0000 UTC m=+1126.106105863" lastFinishedPulling="2025-10-07 19:19:00.086387634 +0000 UTC m=+1128.908426271" observedRunningTime="2025-10-07 19:19:01.377989558 +0000 UTC m=+1130.200028195" watchObservedRunningTime="2025-10-07 19:19:02.618458129 +0000 UTC m=+1131.440496766" Oct 07 19:19:03 crc kubenswrapper[4825]: I1007 19:19:03.342099 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 19:19:03 crc kubenswrapper[4825]: I1007 19:19:03.590686 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:04 crc kubenswrapper[4825]: I1007 19:19:04.143019 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:04 crc kubenswrapper[4825]: I1007 19:19:04.265791 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 19:19:05 crc kubenswrapper[4825]: I1007 19:19:05.373209 4825 generic.go:334] "Generic (PLEG): container finished" podID="ba465067-0e79-4d52-bc56-a4b60767eb7d" containerID="e2453075a44cf7eb404e3853642aeab5fe6820e2fbc4aed966e4b20a5ca29edd" exitCode=0 Oct 07 19:19:05 crc kubenswrapper[4825]: I1007 19:19:05.373278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vlkds" event={"ID":"ba465067-0e79-4d52-bc56-a4b60767eb7d","Type":"ContainerDied","Data":"e2453075a44cf7eb404e3853642aeab5fe6820e2fbc4aed966e4b20a5ca29edd"} Oct 07 19:19:06 crc kubenswrapper[4825]: I1007 
19:19:06.154413 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:19:06 crc kubenswrapper[4825]: I1007 19:19:06.231096 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-thd5m"] Oct 07 19:19:06 crc kubenswrapper[4825]: I1007 19:19:06.231630 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" podUID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerName="dnsmasq-dns" containerID="cri-o://03374d2ad036aa2e6ad45f8e20c41a857fc989a03e2bed79eae350e12f365976" gracePeriod=10 Oct 07 19:19:07 crc kubenswrapper[4825]: I1007 19:19:07.402720 4825 generic.go:334] "Generic (PLEG): container finished" podID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerID="03374d2ad036aa2e6ad45f8e20c41a857fc989a03e2bed79eae350e12f365976" exitCode=0 Oct 07 19:19:07 crc kubenswrapper[4825]: I1007 19:19:07.402776 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" event={"ID":"6df6a4eb-4cdf-4080-98bb-1b23e46755cb","Type":"ContainerDied","Data":"03374d2ad036aa2e6ad45f8e20c41a857fc989a03e2bed79eae350e12f365976"} Oct 07 19:19:07 crc kubenswrapper[4825]: I1007 19:19:07.717469 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:19:07 crc kubenswrapper[4825]: I1007 19:19:07.736062 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.384640 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vlkds" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.472757 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vlkds" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.472796 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vlkds" event={"ID":"ba465067-0e79-4d52-bc56-a4b60767eb7d","Type":"ContainerDied","Data":"6444a930ea0d25a394f49662cf1474644cf9dc30fddaf5cf1d27f748730f25f5"} Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.472817 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6444a930ea0d25a394f49662cf1474644cf9dc30fddaf5cf1d27f748730f25f5" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.482712 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-db-sync-config-data\") pod \"ba465067-0e79-4d52-bc56-a4b60767eb7d\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.482765 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba465067-0e79-4d52-bc56-a4b60767eb7d-etc-machine-id\") pod \"ba465067-0e79-4d52-bc56-a4b60767eb7d\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.482836 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-scripts\") pod \"ba465067-0e79-4d52-bc56-a4b60767eb7d\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.482897 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrmz2\" (UniqueName: \"kubernetes.io/projected/ba465067-0e79-4d52-bc56-a4b60767eb7d-kube-api-access-zrmz2\") pod \"ba465067-0e79-4d52-bc56-a4b60767eb7d\" (UID: 
\"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.482933 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-combined-ca-bundle\") pod \"ba465067-0e79-4d52-bc56-a4b60767eb7d\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.482962 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-config-data\") pod \"ba465067-0e79-4d52-bc56-a4b60767eb7d\" (UID: \"ba465067-0e79-4d52-bc56-a4b60767eb7d\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.488408 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba465067-0e79-4d52-bc56-a4b60767eb7d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ba465067-0e79-4d52-bc56-a4b60767eb7d" (UID: "ba465067-0e79-4d52-bc56-a4b60767eb7d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.503463 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ba465067-0e79-4d52-bc56-a4b60767eb7d" (UID: "ba465067-0e79-4d52-bc56-a4b60767eb7d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.507250 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba465067-0e79-4d52-bc56-a4b60767eb7d-kube-api-access-zrmz2" (OuterVolumeSpecName: "kube-api-access-zrmz2") pod "ba465067-0e79-4d52-bc56-a4b60767eb7d" (UID: "ba465067-0e79-4d52-bc56-a4b60767eb7d"). InnerVolumeSpecName "kube-api-access-zrmz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.530787 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-scripts" (OuterVolumeSpecName: "scripts") pod "ba465067-0e79-4d52-bc56-a4b60767eb7d" (UID: "ba465067-0e79-4d52-bc56-a4b60767eb7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.570104 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-config-data" (OuterVolumeSpecName: "config-data") pod "ba465067-0e79-4d52-bc56-a4b60767eb7d" (UID: "ba465067-0e79-4d52-bc56-a4b60767eb7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.581719 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.585275 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrmz2\" (UniqueName: \"kubernetes.io/projected/ba465067-0e79-4d52-bc56-a4b60767eb7d-kube-api-access-zrmz2\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.585298 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.585307 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.585317 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba465067-0e79-4d52-bc56-a4b60767eb7d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.585324 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.587308 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba465067-0e79-4d52-bc56-a4b60767eb7d" (UID: "ba465067-0e79-4d52-bc56-a4b60767eb7d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: E1007 19:19:08.646064 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.686887 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-config\") pod \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.686977 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-sb\") pod \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.687040 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-svc\") pod \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.687065 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-swift-storage-0\") pod \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.687553 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-nb\") pod \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.687592 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpmdm\" (UniqueName: \"kubernetes.io/projected/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-kube-api-access-tpmdm\") pod \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\" (UID: \"6df6a4eb-4cdf-4080-98bb-1b23e46755cb\") " Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.688526 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba465067-0e79-4d52-bc56-a4b60767eb7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.691358 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-kube-api-access-tpmdm" (OuterVolumeSpecName: "kube-api-access-tpmdm") pod "6df6a4eb-4cdf-4080-98bb-1b23e46755cb" (UID: "6df6a4eb-4cdf-4080-98bb-1b23e46755cb"). InnerVolumeSpecName "kube-api-access-tpmdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.736511 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-config" (OuterVolumeSpecName: "config") pod "6df6a4eb-4cdf-4080-98bb-1b23e46755cb" (UID: "6df6a4eb-4cdf-4080-98bb-1b23e46755cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.739388 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6df6a4eb-4cdf-4080-98bb-1b23e46755cb" (UID: "6df6a4eb-4cdf-4080-98bb-1b23e46755cb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.753785 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6df6a4eb-4cdf-4080-98bb-1b23e46755cb" (UID: "6df6a4eb-4cdf-4080-98bb-1b23e46755cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.769835 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6df6a4eb-4cdf-4080-98bb-1b23e46755cb" (UID: "6df6a4eb-4cdf-4080-98bb-1b23e46755cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.775085 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6df6a4eb-4cdf-4080-98bb-1b23e46755cb" (UID: "6df6a4eb-4cdf-4080-98bb-1b23e46755cb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.790692 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.790879 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpmdm\" (UniqueName: \"kubernetes.io/projected/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-kube-api-access-tpmdm\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.790905 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.790924 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.790942 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:08 crc kubenswrapper[4825]: I1007 19:19:08.790959 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6df6a4eb-4cdf-4080-98bb-1b23e46755cb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.484895 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.484972 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-thd5m" event={"ID":"6df6a4eb-4cdf-4080-98bb-1b23e46755cb","Type":"ContainerDied","Data":"aa45d2bc4166c4326df69a017da6fdd01cc64945b71ca15244bf4cf4875b1675"} Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.485165 4825 scope.go:117] "RemoveContainer" containerID="03374d2ad036aa2e6ad45f8e20c41a857fc989a03e2bed79eae350e12f365976" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.489171 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerStarted","Data":"20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23"} Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.489355 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="ceilometer-notification-agent" containerID="cri-o://ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f" gracePeriod=30 Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.489635 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.489674 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="sg-core" containerID="cri-o://112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1" gracePeriod=30 Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.489700 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="proxy-httpd" 
containerID="cri-o://20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23" gracePeriod=30 Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.513015 4825 scope.go:117] "RemoveContainer" containerID="e8ce30b5917d42b82d34b2604d3a74fc9898f44bc2d2ec1715e74b921c8fc0bd" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.557284 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-thd5m"] Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.561078 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-thd5m"] Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.726604 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:09 crc kubenswrapper[4825]: E1007 19:19:09.727050 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba465067-0e79-4d52-bc56-a4b60767eb7d" containerName="cinder-db-sync" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.727074 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba465067-0e79-4d52-bc56-a4b60767eb7d" containerName="cinder-db-sync" Oct 07 19:19:09 crc kubenswrapper[4825]: E1007 19:19:09.727107 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerName="init" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.727115 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerName="init" Oct 07 19:19:09 crc kubenswrapper[4825]: E1007 19:19:09.727132 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerName="dnsmasq-dns" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.727140 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerName="dnsmasq-dns" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.727339 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ba465067-0e79-4d52-bc56-a4b60767eb7d" containerName="cinder-db-sync" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.727360 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" containerName="dnsmasq-dns" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.728262 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.735518 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.735787 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-25xcg" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.735926 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.736045 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.751525 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.769079 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-z8gvh"] Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.770503 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.813878 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-scripts\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.813926 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.814008 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs8lg\" (UniqueName: \"kubernetes.io/projected/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-kube-api-access-cs8lg\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.814028 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.814054 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " 
pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.814154 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.820071 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df6a4eb-4cdf-4080-98bb-1b23e46755cb" path="/var/lib/kubelet/pods/6df6a4eb-4cdf-4080-98bb-1b23e46755cb/volumes" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.820635 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-z8gvh"] Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.904476 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.927797 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.927884 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqlc\" (UniqueName: \"kubernetes.io/projected/a011ec14-9f09-4d4a-98e3-607190afaaa9-kube-api-access-pfqlc\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.927914 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.927998 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs8lg\" (UniqueName: \"kubernetes.io/projected/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-kube-api-access-cs8lg\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928032 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928080 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-config\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928119 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928174 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928211 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-svc\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928255 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928295 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928373 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-scripts\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.928451 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.930331 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.933674 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.933687 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.933799 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.934089 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.956002 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs8lg\" (UniqueName: \"kubernetes.io/projected/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-kube-api-access-cs8lg\") pod \"cinder-scheduler-0\" (UID: 
\"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.973585 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:09 crc kubenswrapper[4825]: I1007 19:19:09.989800 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-scripts\") pod \"cinder-scheduler-0\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:10 crc kubenswrapper[4825]: E1007 19:19:10.026324 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec7dda1_a8ec_4aa6_a3be_25c200b51d15.slice/crio-conmon-20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec7dda1_a8ec_4aa6_a3be_25c200b51d15.slice/crio-20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23.scope\": RecentStats: unable to find data in memory cache]" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030521 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030568 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a360b8-34a4-46c9-843d-1e16eb594b69-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc 
kubenswrapper[4825]: I1007 19:19:10.030607 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqlc\" (UniqueName: \"kubernetes.io/projected/a011ec14-9f09-4d4a-98e3-607190afaaa9-kube-api-access-pfqlc\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030632 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030650 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-scripts\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030674 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8k5w\" (UniqueName: \"kubernetes.io/projected/16a360b8-34a4-46c9-843d-1e16eb594b69-kube-api-access-p8k5w\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030704 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030724 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-config\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030755 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a360b8-34a4-46c9-843d-1e16eb594b69-logs\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030799 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-svc\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030818 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030838 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.030868 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.031809 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-config\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.031883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-svc\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.032273 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.032555 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.032827 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.049076 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqlc\" (UniqueName: \"kubernetes.io/projected/a011ec14-9f09-4d4a-98e3-607190afaaa9-kube-api-access-pfqlc\") pod \"dnsmasq-dns-5784cf869f-z8gvh\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.072777 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132387 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a360b8-34a4-46c9-843d-1e16eb594b69-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132487 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-scripts\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132531 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8k5w\" (UniqueName: \"kubernetes.io/projected/16a360b8-34a4-46c9-843d-1e16eb594b69-kube-api-access-p8k5w\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132572 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132618 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a360b8-34a4-46c9-843d-1e16eb594b69-logs\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132520 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a360b8-34a4-46c9-843d-1e16eb594b69-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132664 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.132731 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.133466 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a360b8-34a4-46c9-843d-1e16eb594b69-logs\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.137938 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-scripts\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.138980 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.139447 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.145021 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.150128 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8k5w\" (UniqueName: \"kubernetes.io/projected/16a360b8-34a4-46c9-843d-1e16eb594b69-kube-api-access-p8k5w\") pod \"cinder-api-0\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.157803 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.317340 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.518038 4825 generic.go:334] "Generic (PLEG): container finished" podID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerID="20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23" exitCode=0 Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.518275 4825 generic.go:334] "Generic (PLEG): container finished" podID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerID="112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1" exitCode=2 Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.518324 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerDied","Data":"20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23"} Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.518349 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerDied","Data":"112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1"} Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.530422 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.665454 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-z8gvh"] Oct 07 19:19:10 crc kubenswrapper[4825]: W1007 19:19:10.782413 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16a360b8_34a4_46c9_843d_1e16eb594b69.slice/crio-0a59c30171b478bb7a161e6248175d9620376537092256045f0d8198e7ad3043 WatchSource:0}: Error finding container 0a59c30171b478bb7a161e6248175d9620376537092256045f0d8198e7ad3043: Status 404 returned error can't find the container with id 
0a59c30171b478bb7a161e6248175d9620376537092256045f0d8198e7ad3043 Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.784067 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:10 crc kubenswrapper[4825]: I1007 19:19:10.895276 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.072306 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-657f997574-lnlbm" Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.125884 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-578479f95d-ht4w8"] Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.126104 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-578479f95d-ht4w8" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api-log" containerID="cri-o://348a031751fa888be540da3478681d04b11c301a45d453dc05e08fa79bd63773" gracePeriod=30 Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.126473 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-578479f95d-ht4w8" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api" containerID="cri-o://6cc2d532cfcc18c8cb9531818e4d3f565130eb88bb11eebcd3d82a23f350d9bf" gracePeriod=30 Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.550776 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a360b8-34a4-46c9-843d-1e16eb594b69","Type":"ContainerStarted","Data":"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6"} Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.551091 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"16a360b8-34a4-46c9-843d-1e16eb594b69","Type":"ContainerStarted","Data":"0a59c30171b478bb7a161e6248175d9620376537092256045f0d8198e7ad3043"} Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.568441 4825 generic.go:334] "Generic (PLEG): container finished" podID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerID="f01a6722eccc007ce9f3c8261a662597e8500d2dc54a9e24a94e83a77b90a126" exitCode=0 Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.568539 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" event={"ID":"a011ec14-9f09-4d4a-98e3-607190afaaa9","Type":"ContainerDied","Data":"f01a6722eccc007ce9f3c8261a662597e8500d2dc54a9e24a94e83a77b90a126"} Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.568569 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" event={"ID":"a011ec14-9f09-4d4a-98e3-607190afaaa9","Type":"ContainerStarted","Data":"3d698efb1ed358e624bc95353be0ec6591182d23fb4f58ef04cde08e24373b63"} Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.575636 4825 generic.go:334] "Generic (PLEG): container finished" podID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerID="348a031751fa888be540da3478681d04b11c301a45d453dc05e08fa79bd63773" exitCode=143 Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.575701 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578479f95d-ht4w8" event={"ID":"6db17e9a-ec51-4c59-ad0d-0835b90c8231","Type":"ContainerDied","Data":"348a031751fa888be540da3478681d04b11c301a45d453dc05e08fa79bd63773"} Oct 07 19:19:11 crc kubenswrapper[4825]: I1007 19:19:11.580979 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"660c15b2-9ce0-4ddf-9a41-4a4cc953972d","Type":"ContainerStarted","Data":"c3c32700a8e55a4d14278b09a910c56bbc1ad77dd71d7cf990974b1d9bc78794"} Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.108631 4825 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.212614 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.304813 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-combined-ca-bundle\") pod \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.304891 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5srm\" (UniqueName: \"kubernetes.io/projected/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-kube-api-access-x5srm\") pod \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.304951 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-sg-core-conf-yaml\") pod \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.305031 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-log-httpd\") pod \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.305106 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-scripts\") pod \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") 
" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.305174 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-run-httpd\") pod \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.305199 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-config-data\") pod \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\" (UID: \"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15\") " Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.305877 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" (UID: "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.306170 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" (UID: "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.309624 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-scripts" (OuterVolumeSpecName: "scripts") pod "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" (UID: "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.323949 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-kube-api-access-x5srm" (OuterVolumeSpecName: "kube-api-access-x5srm") pod "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" (UID: "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15"). InnerVolumeSpecName "kube-api-access-x5srm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.341941 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" (UID: "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.388580 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" (UID: "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.407998 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5srm\" (UniqueName: \"kubernetes.io/projected/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-kube-api-access-x5srm\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.408042 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.408053 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.408062 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.408071 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.408079 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.408452 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-config-data" (OuterVolumeSpecName: "config-data") pod "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" (UID: "5ec7dda1-a8ec-4aa6-a3be-25c200b51d15"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.510095 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.604537 4825 generic.go:334] "Generic (PLEG): container finished" podID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerID="ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f" exitCode=0 Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.604615 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerDied","Data":"ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f"} Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.604685 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.604850 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec7dda1-a8ec-4aa6-a3be-25c200b51d15","Type":"ContainerDied","Data":"ef067aaf69830f4ddb564bc3588e55e04ba8e33c21200559a93c30ce956b0c9a"} Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.604876 4825 scope.go:117] "RemoveContainer" containerID="20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.626277 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" event={"ID":"a011ec14-9f09-4d4a-98e3-607190afaaa9","Type":"ContainerStarted","Data":"5b5584025e6a7261245126fa84d003b6cd1b5d53330331c728dbcf2b937387a3"} Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.626549 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.659663 4825 scope.go:117] "RemoveContainer" containerID="112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.678937 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.691293 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.697061 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" podStartSLOduration=3.697042385 podStartE2EDuration="3.697042385s" podCreationTimestamp="2025-10-07 19:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:19:12.675848763 +0000 UTC m=+1141.497887400" 
watchObservedRunningTime="2025-10-07 19:19:12.697042385 +0000 UTC m=+1141.519081022" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.704717 4825 scope.go:117] "RemoveContainer" containerID="ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.705386 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:12 crc kubenswrapper[4825]: E1007 19:19:12.705935 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="proxy-httpd" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.705957 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="proxy-httpd" Oct 07 19:19:12 crc kubenswrapper[4825]: E1007 19:19:12.705973 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="sg-core" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.705982 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="sg-core" Oct 07 19:19:12 crc kubenswrapper[4825]: E1007 19:19:12.706068 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="ceilometer-notification-agent" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.706078 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="ceilometer-notification-agent" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.706264 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="proxy-httpd" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.706292 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="sg-core" 
Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.706303 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" containerName="ceilometer-notification-agent" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.708679 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.712557 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.712810 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.714408 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.761960 4825 scope.go:117] "RemoveContainer" containerID="20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23" Oct 07 19:19:12 crc kubenswrapper[4825]: E1007 19:19:12.762354 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23\": container with ID starting with 20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23 not found: ID does not exist" containerID="20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.762383 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23"} err="failed to get container status \"20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23\": rpc error: code = NotFound desc = could not find container \"20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23\": container with ID starting 
with 20c68df7466884e12ae6674761a3ac7f6ed8c81c47c4ea3622c52412feb93e23 not found: ID does not exist" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.762405 4825 scope.go:117] "RemoveContainer" containerID="112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1" Oct 07 19:19:12 crc kubenswrapper[4825]: E1007 19:19:12.762649 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1\": container with ID starting with 112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1 not found: ID does not exist" containerID="112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.762676 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1"} err="failed to get container status \"112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1\": rpc error: code = NotFound desc = could not find container \"112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1\": container with ID starting with 112c82a4d6940b91671a4ea3bc2dfa34faa8090b62c0f942517318ecf5fcd0a1 not found: ID does not exist" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.762691 4825 scope.go:117] "RemoveContainer" containerID="ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f" Oct 07 19:19:12 crc kubenswrapper[4825]: E1007 19:19:12.762871 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f\": container with ID starting with ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f not found: ID does not exist" containerID="ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f" Oct 07 19:19:12 
crc kubenswrapper[4825]: I1007 19:19:12.762889 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f"} err="failed to get container status \"ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f\": rpc error: code = NotFound desc = could not find container \"ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f\": container with ID starting with ab1230628bfa9906331a75a996428b1e675abe1e75bda8d046a0a80ebd77346f not found: ID does not exist" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.815530 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-config-data\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.816182 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-scripts\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.816220 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-log-httpd\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.816264 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.816287 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-run-httpd\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.816324 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4kzb\" (UniqueName: \"kubernetes.io/projected/f475d4d9-9da4-48b3-a999-0b53d1ef346c-kube-api-access-v4kzb\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.816392 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918133 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-config-data\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918283 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-scripts\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918303 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-log-httpd\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918327 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918348 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-run-httpd\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918367 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4kzb\" (UniqueName: \"kubernetes.io/projected/f475d4d9-9da4-48b3-a999-0b53d1ef346c-kube-api-access-v4kzb\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918435 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.918946 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.919445 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-log-httpd\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.924676 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.925063 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-config-data\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.925707 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.926631 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-scripts\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.933971 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4kzb\" (UniqueName: 
\"kubernetes.io/projected/f475d4d9-9da4-48b3-a999-0b53d1ef346c-kube-api-access-v4kzb\") pod \"ceilometer-0\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " pod="openstack/ceilometer-0" Oct 07 19:19:12 crc kubenswrapper[4825]: I1007 19:19:12.964724 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.032827 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.040209 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.531957 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.643619 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerStarted","Data":"10427bd48ec7d7dfea3171e0a1ffee9d926bbb4ffde055dcf5e657db3d47c6b8"} Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.650437 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"660c15b2-9ce0-4ddf-9a41-4a4cc953972d","Type":"ContainerStarted","Data":"ebaa616cf731a27f980a47819aeb87f9b0768594813120378c8ef26fb8f688e0"} Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.650488 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"660c15b2-9ce0-4ddf-9a41-4a4cc953972d","Type":"ContainerStarted","Data":"f0d527bdf3e5074ab3524cfe956db82de0ca401dd197a649a9e6cbe95f228cf6"} Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.655743 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" 
containerName="cinder-api-log" containerID="cri-o://f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6" gracePeriod=30 Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.655925 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a360b8-34a4-46c9-843d-1e16eb594b69","Type":"ContainerStarted","Data":"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507"} Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.655964 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.655991 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerName="cinder-api" containerID="cri-o://020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507" gracePeriod=30 Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.685547 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.7385492940000002 podStartE2EDuration="4.685525985s" podCreationTimestamp="2025-10-07 19:19:09 +0000 UTC" firstStartedPulling="2025-10-07 19:19:10.581682182 +0000 UTC m=+1139.403720819" lastFinishedPulling="2025-10-07 19:19:11.528658883 +0000 UTC m=+1140.350697510" observedRunningTime="2025-10-07 19:19:13.671893012 +0000 UTC m=+1142.493931649" watchObservedRunningTime="2025-10-07 19:19:13.685525985 +0000 UTC m=+1142.507564642" Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.699587 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.6995694310000005 podStartE2EDuration="4.699569431s" podCreationTimestamp="2025-10-07 19:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 
19:19:13.691553246 +0000 UTC m=+1142.513591883" watchObservedRunningTime="2025-10-07 19:19:13.699569431 +0000 UTC m=+1142.521608068" Oct 07 19:19:13 crc kubenswrapper[4825]: I1007 19:19:13.810098 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec7dda1-a8ec-4aa6-a3be-25c200b51d15" path="/var/lib/kubelet/pods/5ec7dda1-a8ec-4aa6-a3be-25c200b51d15/volumes" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.187459 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.243787 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a360b8-34a4-46c9-843d-1e16eb594b69-etc-machine-id\") pod \"16a360b8-34a4-46c9-843d-1e16eb594b69\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.243854 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-combined-ca-bundle\") pod \"16a360b8-34a4-46c9-843d-1e16eb594b69\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.244030 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data-custom\") pod \"16a360b8-34a4-46c9-843d-1e16eb594b69\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.244049 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data\") pod \"16a360b8-34a4-46c9-843d-1e16eb594b69\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " Oct 07 19:19:14 crc 
kubenswrapper[4825]: I1007 19:19:14.244072 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8k5w\" (UniqueName: \"kubernetes.io/projected/16a360b8-34a4-46c9-843d-1e16eb594b69-kube-api-access-p8k5w\") pod \"16a360b8-34a4-46c9-843d-1e16eb594b69\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.244133 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a360b8-34a4-46c9-843d-1e16eb594b69-logs\") pod \"16a360b8-34a4-46c9-843d-1e16eb594b69\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.244167 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-scripts\") pod \"16a360b8-34a4-46c9-843d-1e16eb594b69\" (UID: \"16a360b8-34a4-46c9-843d-1e16eb594b69\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.252715 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16a360b8-34a4-46c9-843d-1e16eb594b69-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "16a360b8-34a4-46c9-843d-1e16eb594b69" (UID: "16a360b8-34a4-46c9-843d-1e16eb594b69"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.255433 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a360b8-34a4-46c9-843d-1e16eb594b69-logs" (OuterVolumeSpecName: "logs") pod "16a360b8-34a4-46c9-843d-1e16eb594b69" (UID: "16a360b8-34a4-46c9-843d-1e16eb594b69"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.256568 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a360b8-34a4-46c9-843d-1e16eb594b69-kube-api-access-p8k5w" (OuterVolumeSpecName: "kube-api-access-p8k5w") pod "16a360b8-34a4-46c9-843d-1e16eb594b69" (UID: "16a360b8-34a4-46c9-843d-1e16eb594b69"). InnerVolumeSpecName "kube-api-access-p8k5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.265391 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-scripts" (OuterVolumeSpecName: "scripts") pod "16a360b8-34a4-46c9-843d-1e16eb594b69" (UID: "16a360b8-34a4-46c9-843d-1e16eb594b69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.273397 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "16a360b8-34a4-46c9-843d-1e16eb594b69" (UID: "16a360b8-34a4-46c9-843d-1e16eb594b69"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.285449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a360b8-34a4-46c9-843d-1e16eb594b69" (UID: "16a360b8-34a4-46c9-843d-1e16eb594b69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.303302 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-578479f95d-ht4w8" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:36932->10.217.0.163:9311: read: connection reset by peer" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.303527 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-578479f95d-ht4w8" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:36928->10.217.0.163:9311: read: connection reset by peer" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.331768 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data" (OuterVolumeSpecName: "config-data") pod "16a360b8-34a4-46c9-843d-1e16eb594b69" (UID: "16a360b8-34a4-46c9-843d-1e16eb594b69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.347838 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a360b8-34a4-46c9-843d-1e16eb594b69-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.347885 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.347895 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.347903 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.347912 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8k5w\" (UniqueName: \"kubernetes.io/projected/16a360b8-34a4-46c9-843d-1e16eb594b69-kube-api-access-p8k5w\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.347922 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a360b8-34a4-46c9-843d-1e16eb594b69-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.347946 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a360b8-34a4-46c9-843d-1e16eb594b69-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.665838 4825 generic.go:334] "Generic 
(PLEG): container finished" podID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerID="020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507" exitCode=0 Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.666118 4825 generic.go:334] "Generic (PLEG): container finished" podID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerID="f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6" exitCode=143 Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.666158 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a360b8-34a4-46c9-843d-1e16eb594b69","Type":"ContainerDied","Data":"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507"} Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.666200 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a360b8-34a4-46c9-843d-1e16eb594b69","Type":"ContainerDied","Data":"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6"} Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.666210 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a360b8-34a4-46c9-843d-1e16eb594b69","Type":"ContainerDied","Data":"0a59c30171b478bb7a161e6248175d9620376537092256045f0d8198e7ad3043"} Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.666242 4825 scope.go:117] "RemoveContainer" containerID="020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.666364 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.670547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerStarted","Data":"6bd2699dce7647575a45a710a982173a379b5c0e9d45437acbbda5c7a4d62717"} Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.673881 4825 generic.go:334] "Generic (PLEG): container finished" podID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerID="6cc2d532cfcc18c8cb9531818e4d3f565130eb88bb11eebcd3d82a23f350d9bf" exitCode=0 Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.674875 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578479f95d-ht4w8" event={"ID":"6db17e9a-ec51-4c59-ad0d-0835b90c8231","Type":"ContainerDied","Data":"6cc2d532cfcc18c8cb9531818e4d3f565130eb88bb11eebcd3d82a23f350d9bf"} Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.674913 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578479f95d-ht4w8" event={"ID":"6db17e9a-ec51-4c59-ad0d-0835b90c8231","Type":"ContainerDied","Data":"c77739ccf3876013f8877d9d5d3a501349ffcfb5fc6c5b55ab84b235aa541c03"} Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.674926 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c77739ccf3876013f8877d9d5d3a501349ffcfb5fc6c5b55ab84b235aa541c03" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.686928 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.712136 4825 scope.go:117] "RemoveContainer" containerID="f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.712291 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.731302 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.755838 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmb6c\" (UniqueName: \"kubernetes.io/projected/6db17e9a-ec51-4c59-ad0d-0835b90c8231-kube-api-access-tmb6c\") pod \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.755892 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db17e9a-ec51-4c59-ad0d-0835b90c8231-logs\") pod \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.755922 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-combined-ca-bundle\") pod \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.756008 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data\") pod \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 
19:19:14.756085 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data-custom\") pod \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\" (UID: \"6db17e9a-ec51-4c59-ad0d-0835b90c8231\") " Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.760673 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db17e9a-ec51-4c59-ad0d-0835b90c8231-logs" (OuterVolumeSpecName: "logs") pod "6db17e9a-ec51-4c59-ad0d-0835b90c8231" (UID: "6db17e9a-ec51-4c59-ad0d-0835b90c8231"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.762853 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db17e9a-ec51-4c59-ad0d-0835b90c8231-kube-api-access-tmb6c" (OuterVolumeSpecName: "kube-api-access-tmb6c") pod "6db17e9a-ec51-4c59-ad0d-0835b90c8231" (UID: "6db17e9a-ec51-4c59-ad0d-0835b90c8231"). InnerVolumeSpecName "kube-api-access-tmb6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.769533 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6db17e9a-ec51-4c59-ad0d-0835b90c8231" (UID: "6db17e9a-ec51-4c59-ad0d-0835b90c8231"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.776679 4825 scope.go:117] "RemoveContainer" containerID="020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507" Oct 07 19:19:14 crc kubenswrapper[4825]: E1007 19:19:14.778140 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507\": container with ID starting with 020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507 not found: ID does not exist" containerID="020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.778186 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507"} err="failed to get container status \"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507\": rpc error: code = NotFound desc = could not find container \"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507\": container with ID starting with 020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507 not found: ID does not exist" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.778211 4825 scope.go:117] "RemoveContainer" containerID="f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6" Oct 07 19:19:14 crc kubenswrapper[4825]: E1007 19:19:14.778578 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6\": container with ID starting with f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6 not found: ID does not exist" containerID="f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.778599 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6"} err="failed to get container status \"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6\": rpc error: code = NotFound desc = could not find container \"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6\": container with ID starting with f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6 not found: ID does not exist" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.778636 4825 scope.go:117] "RemoveContainer" containerID="020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.779006 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507"} err="failed to get container status \"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507\": rpc error: code = NotFound desc = could not find container \"020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507\": container with ID starting with 020cb57b84ea37ed0cd3399149daec9d3592faa4c6e00ad57afdf37320d8f507 not found: ID does not exist" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.779051 4825 scope.go:117] "RemoveContainer" containerID="f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.779275 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6"} err="failed to get container status \"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6\": rpc error: code = NotFound desc = could not find container \"f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6\": container with ID starting with 
f657c62e1dbe0affe289c281eec58a1274ac34dd86611f564b798a21ce2797e6 not found: ID does not exist" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.787475 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:14 crc kubenswrapper[4825]: E1007 19:19:14.788012 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788038 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api" Oct 07 19:19:14 crc kubenswrapper[4825]: E1007 19:19:14.788059 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerName="cinder-api-log" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788072 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerName="cinder-api-log" Oct 07 19:19:14 crc kubenswrapper[4825]: E1007 19:19:14.788085 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerName="cinder-api" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788093 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerName="cinder-api" Oct 07 19:19:14 crc kubenswrapper[4825]: E1007 19:19:14.788108 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api-log" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788115 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api-log" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788374 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" 
containerName="barbican-api" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788417 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerName="cinder-api-log" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788432 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" containerName="cinder-api" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.788447 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" containerName="barbican-api-log" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.789798 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.792684 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.793106 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.793650 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.803162 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.813091 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6db17e9a-ec51-4c59-ad0d-0835b90c8231" (UID: "6db17e9a-ec51-4c59-ad0d-0835b90c8231"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.814215 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data" (OuterVolumeSpecName: "config-data") pod "6db17e9a-ec51-4c59-ad0d-0835b90c8231" (UID: "6db17e9a-ec51-4c59-ad0d-0835b90c8231"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.858350 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.858986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-config-data-custom\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.859109 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24nfg\" (UniqueName: \"kubernetes.io/projected/598ea581-8e2b-47f6-8360-3907ab4c3f49-kube-api-access-24nfg\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.859256 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " 
pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.859418 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-public-tls-certs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.859519 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-scripts\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.859617 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/598ea581-8e2b-47f6-8360-3907ab4c3f49-etc-machine-id\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.859763 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598ea581-8e2b-47f6-8360-3907ab4c3f49-logs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.859973 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-config-data\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.860138 4825 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.860217 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.860314 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmb6c\" (UniqueName: \"kubernetes.io/projected/6db17e9a-ec51-4c59-ad0d-0835b90c8231-kube-api-access-tmb6c\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.860385 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db17e9a-ec51-4c59-ad0d-0835b90c8231-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.860466 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db17e9a-ec51-4c59-ad0d-0835b90c8231-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.867787 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-58d7dd5b56-nhlgz" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.931997 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.932510 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-566c6c8d88-h74t9"] Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.963555 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-config-data\") pod \"cinder-api-0\" 
(UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.963659 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.963687 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-config-data-custom\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.963718 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24nfg\" (UniqueName: \"kubernetes.io/projected/598ea581-8e2b-47f6-8360-3907ab4c3f49-kube-api-access-24nfg\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.963736 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.964686 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-public-tls-certs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.964727 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-scripts\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.964752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/598ea581-8e2b-47f6-8360-3907ab4c3f49-etc-machine-id\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.964839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598ea581-8e2b-47f6-8360-3907ab4c3f49-logs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.966482 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598ea581-8e2b-47f6-8360-3907ab4c3f49-logs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.973312 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.973398 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/598ea581-8e2b-47f6-8360-3907ab4c3f49-etc-machine-id\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc 
kubenswrapper[4825]: I1007 19:19:14.973938 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-config-data-custom\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.974100 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-public-tls-certs\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.974171 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-config-data\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.975985 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.976825 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598ea581-8e2b-47f6-8360-3907ab4c3f49-scripts\") pod \"cinder-api-0\" (UID: \"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:14 crc kubenswrapper[4825]: I1007 19:19:14.991030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24nfg\" (UniqueName: \"kubernetes.io/projected/598ea581-8e2b-47f6-8360-3907ab4c3f49-kube-api-access-24nfg\") pod \"cinder-api-0\" (UID: 
\"598ea581-8e2b-47f6-8360-3907ab4c3f49\") " pod="openstack/cinder-api-0" Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.073867 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.118084 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.610959 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.703326 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"598ea581-8e2b-47f6-8360-3907ab4c3f49","Type":"ContainerStarted","Data":"c19c21965e1927a6334264759ee2fc6290c979436adbe77c5ffb7294c10bb0c9"} Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.710190 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerStarted","Data":"aa5a0553e86641282d137d798de16e1b9de03cf1d45e9d0fb725517914e54730"} Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.710275 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-578479f95d-ht4w8" Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.711247 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-566c6c8d88-h74t9" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon-log" containerID="cri-o://aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840" gracePeriod=30 Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.711282 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-566c6c8d88-h74t9" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" containerID="cri-o://b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20" gracePeriod=30 Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.751433 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-578479f95d-ht4w8"] Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.756479 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-578479f95d-ht4w8"] Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.806813 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a360b8-34a4-46c9-843d-1e16eb594b69" path="/var/lib/kubelet/pods/16a360b8-34a4-46c9-843d-1e16eb594b69/volumes" Oct 07 19:19:15 crc kubenswrapper[4825]: I1007 19:19:15.808103 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db17e9a-ec51-4c59-ad0d-0835b90c8231" path="/var/lib/kubelet/pods/6db17e9a-ec51-4c59-ad0d-0835b90c8231/volumes" Oct 07 19:19:16 crc kubenswrapper[4825]: I1007 19:19:16.731092 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"598ea581-8e2b-47f6-8360-3907ab4c3f49","Type":"ContainerStarted","Data":"a687e0caca26b2d0099b182f612d8a8979c31c1ae3fe4ffa49f10ef624a76a38"} Oct 07 19:19:16 crc kubenswrapper[4825]: I1007 19:19:16.734143 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerStarted","Data":"82c06c1fb3a3b098e32195151b334fdb1ca79a78466d65cda9c856f55c6238b1"} Oct 07 19:19:17 crc kubenswrapper[4825]: I1007 19:19:17.750332 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerStarted","Data":"80ab38ef827c51547550368942fb145f1972d84d6827c922550de0645a815544"} Oct 07 19:19:17 crc kubenswrapper[4825]: I1007 19:19:17.751150 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 19:19:17 crc kubenswrapper[4825]: I1007 19:19:17.754335 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"598ea581-8e2b-47f6-8360-3907ab4c3f49","Type":"ContainerStarted","Data":"5d3ac4df74ac2a8285d85cc0e51030b4e839fed4ff5249100cbb464b6184c974"} Oct 07 19:19:17 crc kubenswrapper[4825]: I1007 19:19:17.754553 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 19:19:17 crc kubenswrapper[4825]: I1007 19:19:17.775100 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.839133143 podStartE2EDuration="5.775084969s" podCreationTimestamp="2025-10-07 19:19:12 +0000 UTC" firstStartedPulling="2025-10-07 19:19:13.534608333 +0000 UTC m=+1142.356646970" lastFinishedPulling="2025-10-07 19:19:17.470560159 +0000 UTC m=+1146.292598796" observedRunningTime="2025-10-07 19:19:17.769215603 +0000 UTC m=+1146.591254240" watchObservedRunningTime="2025-10-07 19:19:17.775084969 +0000 UTC m=+1146.597123596" Oct 07 19:19:19 crc kubenswrapper[4825]: I1007 19:19:19.783210 4825 generic.go:334] "Generic (PLEG): container finished" podID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerID="b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20" exitCode=0 
Oct 07 19:19:19 crc kubenswrapper[4825]: I1007 19:19:19.783331 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566c6c8d88-h74t9" event={"ID":"b66fe3a9-9849-4219-badb-a0cecbb2a388","Type":"ContainerDied","Data":"b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20"} Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.161222 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.197696 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.197674209 podStartE2EDuration="6.197674209s" podCreationTimestamp="2025-10-07 19:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:19:17.81286767 +0000 UTC m=+1146.634906317" watchObservedRunningTime="2025-10-07 19:19:20.197674209 +0000 UTC m=+1149.019712866" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.242374 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xfgfj"] Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.242777 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" podUID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerName="dnsmasq-dns" containerID="cri-o://cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317" gracePeriod=10 Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.395577 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.416699 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-566c6c8d88-h74t9" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" probeResult="failure" 
output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.434777 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.644260 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.744431 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.779441 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-swift-storage-0\") pod \"ce0b553c-e9d5-4613-a202-65fc185f60b4\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.779580 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-nb\") pod \"ce0b553c-e9d5-4613-a202-65fc185f60b4\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.779715 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-config\") pod \"ce0b553c-e9d5-4613-a202-65fc185f60b4\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.779771 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wzxf\" (UniqueName: \"kubernetes.io/projected/ce0b553c-e9d5-4613-a202-65fc185f60b4-kube-api-access-2wzxf\") pod 
\"ce0b553c-e9d5-4613-a202-65fc185f60b4\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.779834 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-sb\") pod \"ce0b553c-e9d5-4613-a202-65fc185f60b4\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.779913 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-svc\") pod \"ce0b553c-e9d5-4613-a202-65fc185f60b4\" (UID: \"ce0b553c-e9d5-4613-a202-65fc185f60b4\") " Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.793421 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0b553c-e9d5-4613-a202-65fc185f60b4-kube-api-access-2wzxf" (OuterVolumeSpecName: "kube-api-access-2wzxf") pod "ce0b553c-e9d5-4613-a202-65fc185f60b4" (UID: "ce0b553c-e9d5-4613-a202-65fc185f60b4"). InnerVolumeSpecName "kube-api-access-2wzxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.815720 4825 generic.go:334] "Generic (PLEG): container finished" podID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerID="cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317" exitCode=0 Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.816018 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.816331 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="cinder-scheduler" containerID="cri-o://f0d527bdf3e5074ab3524cfe956db82de0ca401dd197a649a9e6cbe95f228cf6" gracePeriod=30 Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.815906 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" event={"ID":"ce0b553c-e9d5-4613-a202-65fc185f60b4","Type":"ContainerDied","Data":"cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317"} Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.816638 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xfgfj" event={"ID":"ce0b553c-e9d5-4613-a202-65fc185f60b4","Type":"ContainerDied","Data":"2d54a3abebec3d9990f0bf446b54246d618db3ba595eaefa9f0b27ee078b5fd0"} Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.816733 4825 scope.go:117] "RemoveContainer" containerID="cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.820577 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="probe" containerID="cri-o://ebaa616cf731a27f980a47819aeb87f9b0768594813120378c8ef26fb8f688e0" gracePeriod=30 Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.869099 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce0b553c-e9d5-4613-a202-65fc185f60b4" (UID: "ce0b553c-e9d5-4613-a202-65fc185f60b4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.880015 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce0b553c-e9d5-4613-a202-65fc185f60b4" (UID: "ce0b553c-e9d5-4613-a202-65fc185f60b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.882470 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.882491 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wzxf\" (UniqueName: \"kubernetes.io/projected/ce0b553c-e9d5-4613-a202-65fc185f60b4-kube-api-access-2wzxf\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.882503 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.896740 4825 scope.go:117] "RemoveContainer" containerID="9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.911623 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-config" (OuterVolumeSpecName: "config") pod "ce0b553c-e9d5-4613-a202-65fc185f60b4" (UID: "ce0b553c-e9d5-4613-a202-65fc185f60b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.920621 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce0b553c-e9d5-4613-a202-65fc185f60b4" (UID: "ce0b553c-e9d5-4613-a202-65fc185f60b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.928727 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce0b553c-e9d5-4613-a202-65fc185f60b4" (UID: "ce0b553c-e9d5-4613-a202-65fc185f60b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.928796 4825 scope.go:117] "RemoveContainer" containerID="cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317" Oct 07 19:19:20 crc kubenswrapper[4825]: E1007 19:19:20.932391 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317\": container with ID starting with cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317 not found: ID does not exist" containerID="cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.932449 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317"} err="failed to get container status \"cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317\": rpc error: code = NotFound desc = could not find container 
\"cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317\": container with ID starting with cafba9783489bfa5a2752f102a3716c76caae423dd52d24982badba596f37317 not found: ID does not exist" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.932494 4825 scope.go:117] "RemoveContainer" containerID="9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db" Oct 07 19:19:20 crc kubenswrapper[4825]: E1007 19:19:20.936358 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db\": container with ID starting with 9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db not found: ID does not exist" containerID="9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.936642 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db"} err="failed to get container status \"9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db\": rpc error: code = NotFound desc = could not find container \"9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db\": container with ID starting with 9ebeaf10a1e81bb05f85ae36561a723f2651bcc0e24f9ede4a6352970c6608db not found: ID does not exist" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.984745 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:20 crc kubenswrapper[4825]: I1007 19:19:20.984774 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:20 crc kubenswrapper[4825]: 
I1007 19:19:20.984783 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b553c-e9d5-4613-a202-65fc185f60b4-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:21 crc kubenswrapper[4825]: I1007 19:19:21.174914 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xfgfj"] Oct 07 19:19:21 crc kubenswrapper[4825]: I1007 19:19:21.186521 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xfgfj"] Oct 07 19:19:21 crc kubenswrapper[4825]: I1007 19:19:21.807067 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0b553c-e9d5-4613-a202-65fc185f60b4" path="/var/lib/kubelet/pods/ce0b553c-e9d5-4613-a202-65fc185f60b4/volumes" Oct 07 19:19:21 crc kubenswrapper[4825]: I1007 19:19:21.831657 4825 generic.go:334] "Generic (PLEG): container finished" podID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerID="ebaa616cf731a27f980a47819aeb87f9b0768594813120378c8ef26fb8f688e0" exitCode=0 Oct 07 19:19:21 crc kubenswrapper[4825]: I1007 19:19:21.831748 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"660c15b2-9ce0-4ddf-9a41-4a4cc953972d","Type":"ContainerDied","Data":"ebaa616cf731a27f980a47819aeb87f9b0768594813120378c8ef26fb8f688e0"} Oct 07 19:19:22 crc kubenswrapper[4825]: I1007 19:19:22.706803 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d47b47d5-hc6q5" Oct 07 19:19:22 crc kubenswrapper[4825]: I1007 19:19:22.817696 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d67bd544-4s2q8"] Oct 07 19:19:22 crc kubenswrapper[4825]: I1007 19:19:22.818053 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d67bd544-4s2q8" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-api" 
containerID="cri-o://430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab" gracePeriod=30 Oct 07 19:19:22 crc kubenswrapper[4825]: I1007 19:19:22.818555 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d67bd544-4s2q8" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-httpd" containerID="cri-o://09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b" gracePeriod=30 Oct 07 19:19:23 crc kubenswrapper[4825]: I1007 19:19:23.856181 4825 generic.go:334] "Generic (PLEG): container finished" podID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerID="f0d527bdf3e5074ab3524cfe956db82de0ca401dd197a649a9e6cbe95f228cf6" exitCode=0 Oct 07 19:19:23 crc kubenswrapper[4825]: I1007 19:19:23.856274 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"660c15b2-9ce0-4ddf-9a41-4a4cc953972d","Type":"ContainerDied","Data":"f0d527bdf3e5074ab3524cfe956db82de0ca401dd197a649a9e6cbe95f228cf6"} Oct 07 19:19:23 crc kubenswrapper[4825]: I1007 19:19:23.858552 4825 generic.go:334] "Generic (PLEG): container finished" podID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerID="09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b" exitCode=0 Oct 07 19:19:23 crc kubenswrapper[4825]: I1007 19:19:23.858602 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d67bd544-4s2q8" event={"ID":"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e","Type":"ContainerDied","Data":"09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b"} Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.576981 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.757619 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-etc-machine-id\") pod \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.757709 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs8lg\" (UniqueName: \"kubernetes.io/projected/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-kube-api-access-cs8lg\") pod \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.757743 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data-custom\") pod \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.757744 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "660c15b2-9ce0-4ddf-9a41-4a4cc953972d" (UID: "660c15b2-9ce0-4ddf-9a41-4a4cc953972d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.757791 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-combined-ca-bundle\") pod \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.757837 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data\") pod \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.757863 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-scripts\") pod \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\" (UID: \"660c15b2-9ce0-4ddf-9a41-4a4cc953972d\") " Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.758385 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.765680 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-kube-api-access-cs8lg" (OuterVolumeSpecName: "kube-api-access-cs8lg") pod "660c15b2-9ce0-4ddf-9a41-4a4cc953972d" (UID: "660c15b2-9ce0-4ddf-9a41-4a4cc953972d"). InnerVolumeSpecName "kube-api-access-cs8lg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.773318 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "660c15b2-9ce0-4ddf-9a41-4a4cc953972d" (UID: "660c15b2-9ce0-4ddf-9a41-4a4cc953972d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.773726 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-scripts" (OuterVolumeSpecName: "scripts") pod "660c15b2-9ce0-4ddf-9a41-4a4cc953972d" (UID: "660c15b2-9ce0-4ddf-9a41-4a4cc953972d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.846584 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "660c15b2-9ce0-4ddf-9a41-4a4cc953972d" (UID: "660c15b2-9ce0-4ddf-9a41-4a4cc953972d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.860817 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.860852 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs8lg\" (UniqueName: \"kubernetes.io/projected/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-kube-api-access-cs8lg\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.860868 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.860883 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.872444 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data" (OuterVolumeSpecName: "config-data") pod "660c15b2-9ce0-4ddf-9a41-4a4cc953972d" (UID: "660c15b2-9ce0-4ddf-9a41-4a4cc953972d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.876756 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"660c15b2-9ce0-4ddf-9a41-4a4cc953972d","Type":"ContainerDied","Data":"c3c32700a8e55a4d14278b09a910c56bbc1ad77dd71d7cf990974b1d9bc78794"} Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.876827 4825 scope.go:117] "RemoveContainer" containerID="ebaa616cf731a27f980a47819aeb87f9b0768594813120378c8ef26fb8f688e0" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.877016 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.949245 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.949728 4825 scope.go:117] "RemoveContainer" containerID="f0d527bdf3e5074ab3524cfe956db82de0ca401dd197a649a9e6cbe95f228cf6" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.962202 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c15b2-9ce0-4ddf-9a41-4a4cc953972d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.965727 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.973435 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:24 crc kubenswrapper[4825]: E1007 19:19:24.973788 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerName="init" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.973804 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerName="init" Oct 07 
19:19:24 crc kubenswrapper[4825]: E1007 19:19:24.973849 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="cinder-scheduler" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.973857 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="cinder-scheduler" Oct 07 19:19:24 crc kubenswrapper[4825]: E1007 19:19:24.974007 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerName="dnsmasq-dns" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.974016 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerName="dnsmasq-dns" Oct 07 19:19:24 crc kubenswrapper[4825]: E1007 19:19:24.974034 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="probe" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.974039 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="probe" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.974311 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="probe" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.974330 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" containerName="cinder-scheduler" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.974346 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0b553c-e9d5-4613-a202-65fc185f60b4" containerName="dnsmasq-dns" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.975221 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 19:19:24 crc kubenswrapper[4825]: I1007 19:19:24.977607 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.033644 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.166026 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.166101 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06faa8d0-8aff-4422-a2cb-8643f6e920a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.166308 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5lr\" (UniqueName: \"kubernetes.io/projected/06faa8d0-8aff-4422-a2cb-8643f6e920a8-kube-api-access-6r5lr\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.166359 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 
19:19:25.166464 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.166501 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.267643 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5lr\" (UniqueName: \"kubernetes.io/projected/06faa8d0-8aff-4422-a2cb-8643f6e920a8-kube-api-access-6r5lr\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.267697 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.267765 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.267800 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.267939 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.267974 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06faa8d0-8aff-4422-a2cb-8643f6e920a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.268093 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06faa8d0-8aff-4422-a2cb-8643f6e920a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.272028 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.273090 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 
07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.274849 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.276037 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faa8d0-8aff-4422-a2cb-8643f6e920a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.292863 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5lr\" (UniqueName: \"kubernetes.io/projected/06faa8d0-8aff-4422-a2cb-8643f6e920a8-kube-api-access-6r5lr\") pod \"cinder-scheduler-0\" (UID: \"06faa8d0-8aff-4422-a2cb-8643f6e920a8\") " pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.293638 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.770281 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.827566 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660c15b2-9ce0-4ddf-9a41-4a4cc953972d" path="/var/lib/kubelet/pods/660c15b2-9ce0-4ddf-9a41-4a4cc953972d/volumes" Oct 07 19:19:25 crc kubenswrapper[4825]: I1007 19:19:25.923497 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"06faa8d0-8aff-4422-a2cb-8643f6e920a8","Type":"ContainerStarted","Data":"10663d314b5c7ba50080b84659cd7bf4bd0800f37ecbe2b725c0f74bc0679217"} Oct 07 19:19:26 crc kubenswrapper[4825]: I1007 19:19:26.706656 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:19:26 crc kubenswrapper[4825]: I1007 19:19:26.707190 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f4b8c987b-kjdd8" Oct 07 19:19:26 crc kubenswrapper[4825]: I1007 19:19:26.946572 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"06faa8d0-8aff-4422-a2cb-8643f6e920a8","Type":"ContainerStarted","Data":"d572b19c04ea748bcffbb0b01183209d78562482714d42712045f0ee0f2a6e4b"} Oct 07 19:19:27 crc kubenswrapper[4825]: I1007 19:19:27.163078 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 19:19:27 crc kubenswrapper[4825]: I1007 19:19:27.493614 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6b848888b7-8bpk8" Oct 07 19:19:27 crc kubenswrapper[4825]: I1007 19:19:27.957631 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"06faa8d0-8aff-4422-a2cb-8643f6e920a8","Type":"ContainerStarted","Data":"3084faf9e00d24f7a13aea505dc289388295e53fab8412dfb324d22a1dad801a"} Oct 07 19:19:27 crc kubenswrapper[4825]: I1007 19:19:27.993813 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.993792415 podStartE2EDuration="3.993792415s" podCreationTimestamp="2025-10-07 19:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:19:27.984391028 +0000 UTC m=+1156.806429675" watchObservedRunningTime="2025-10-07 19:19:27.993792415 +0000 UTC m=+1156.815831052" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.528074 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.576889 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-combined-ca-bundle\") pod \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.576934 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-httpd-config\") pod \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.577065 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-config\") pod \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.577120 
4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-ovndb-tls-certs\") pod \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.577190 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974lp\" (UniqueName: \"kubernetes.io/projected/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-kube-api-access-974lp\") pod \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\" (UID: \"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e\") " Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.588253 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" (UID: "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.602610 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-kube-api-access-974lp" (OuterVolumeSpecName: "kube-api-access-974lp") pod "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" (UID: "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e"). InnerVolumeSpecName "kube-api-access-974lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.634519 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-config" (OuterVolumeSpecName: "config") pod "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" (UID: "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.644115 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" (UID: "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.673952 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" (UID: "7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.681758 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974lp\" (UniqueName: \"kubernetes.io/projected/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-kube-api-access-974lp\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.681789 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.681822 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.681851 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-config\") on node \"crc\" DevicePath 
\"\"" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.681863 4825 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.981888 4825 generic.go:334] "Generic (PLEG): container finished" podID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerID="430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab" exitCode=0 Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.982330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d67bd544-4s2q8" event={"ID":"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e","Type":"ContainerDied","Data":"430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab"} Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.982377 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d67bd544-4s2q8" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.982395 4825 scope.go:117] "RemoveContainer" containerID="09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b" Oct 07 19:19:28 crc kubenswrapper[4825]: I1007 19:19:28.982383 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d67bd544-4s2q8" event={"ID":"7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e","Type":"ContainerDied","Data":"82c3dd9b7a5795f4cb1ff430c8d976cb273cc75df29b1e31c7c9da0f13f3c9cc"} Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.021516 4825 scope.go:117] "RemoveContainer" containerID="430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab" Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.024352 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d67bd544-4s2q8"] Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.032614 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d67bd544-4s2q8"] 
Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.080093 4825 scope.go:117] "RemoveContainer" containerID="09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b" Oct 07 19:19:29 crc kubenswrapper[4825]: E1007 19:19:29.080531 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b\": container with ID starting with 09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b not found: ID does not exist" containerID="09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b" Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.080575 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b"} err="failed to get container status \"09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b\": rpc error: code = NotFound desc = could not find container \"09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b\": container with ID starting with 09ada7c56e9f60831da9c75e12cab1b4387bfb80b9cfe3fb9b464fa7d29f7d7b not found: ID does not exist" Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.080605 4825 scope.go:117] "RemoveContainer" containerID="430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab" Oct 07 19:19:29 crc kubenswrapper[4825]: E1007 19:19:29.080909 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab\": container with ID starting with 430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab not found: ID does not exist" containerID="430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab" Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.080945 4825 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab"} err="failed to get container status \"430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab\": rpc error: code = NotFound desc = could not find container \"430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab\": container with ID starting with 430b8417f7069c5ac78b65ba019fba33cf881d8bc2f3c7b4a4613012d49c9aab not found: ID does not exist" Oct 07 19:19:29 crc kubenswrapper[4825]: I1007 19:19:29.808253 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" path="/var/lib/kubelet/pods/7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e/volumes" Oct 07 19:19:30 crc kubenswrapper[4825]: I1007 19:19:30.294348 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 19:19:30 crc kubenswrapper[4825]: I1007 19:19:30.409335 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-566c6c8d88-h74t9" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.337174 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.337730 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-central-agent" containerID="cri-o://6bd2699dce7647575a45a710a982173a379b5c0e9d45437acbbda5c7a4d62717" gracePeriod=30 Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.337825 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="sg-core" containerID="cri-o://82c06c1fb3a3b098e32195151b334fdb1ca79a78466d65cda9c856f55c6238b1" gracePeriod=30 Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.337877 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-notification-agent" containerID="cri-o://aa5a0553e86641282d137d798de16e1b9de03cf1d45e9d0fb725517914e54730" gracePeriod=30 Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.337904 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="proxy-httpd" containerID="cri-o://80ab38ef827c51547550368942fb145f1972d84d6827c922550de0645a815544" gracePeriod=30 Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.346905 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.853257 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-57f8b4b869-t42c2"] Oct 07 19:19:31 crc kubenswrapper[4825]: E1007 19:19:31.853620 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-api" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.853637 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-api" Oct 07 19:19:31 crc kubenswrapper[4825]: E1007 19:19:31.853659 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-httpd" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.853666 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-httpd" Oct 07 
19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.853848 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-httpd" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.853864 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb39c5d-6f8d-407c-aeba-4fdd48b8cb0e" containerName="neutron-api" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.854766 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.864780 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.864976 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.865088 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.882009 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-57f8b4b869-t42c2"] Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951050 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeabd5f0-6573-402d-a5df-c0bc41d16a67-log-httpd\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951406 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-internal-tls-certs\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: 
\"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951427 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wddxm\" (UniqueName: \"kubernetes.io/projected/aeabd5f0-6573-402d-a5df-c0bc41d16a67-kube-api-access-wddxm\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951449 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aeabd5f0-6573-402d-a5df-c0bc41d16a67-etc-swift\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951476 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-config-data\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951504 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-combined-ca-bundle\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951540 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-public-tls-certs\") 
pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.951560 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeabd5f0-6573-402d-a5df-c0bc41d16a67-run-httpd\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.967152 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.968718 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.975625 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9qn9l" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.976759 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.977384 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 07 19:19:31 crc kubenswrapper[4825]: I1007 19:19:31.978074 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.017772 4825 generic.go:334] "Generic (PLEG): container finished" podID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerID="80ab38ef827c51547550368942fb145f1972d84d6827c922550de0645a815544" exitCode=0 Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.017808 4825 generic.go:334] "Generic (PLEG): container finished" podID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" 
containerID="82c06c1fb3a3b098e32195151b334fdb1ca79a78466d65cda9c856f55c6238b1" exitCode=2 Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.017817 4825 generic.go:334] "Generic (PLEG): container finished" podID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerID="6bd2699dce7647575a45a710a982173a379b5c0e9d45437acbbda5c7a4d62717" exitCode=0 Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.017847 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerDied","Data":"80ab38ef827c51547550368942fb145f1972d84d6827c922550de0645a815544"} Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.017875 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerDied","Data":"82c06c1fb3a3b098e32195151b334fdb1ca79a78466d65cda9c856f55c6238b1"} Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.017886 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerDied","Data":"6bd2699dce7647575a45a710a982173a379b5c0e9d45437acbbda5c7a4d62717"} Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053150 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-config-data\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053210 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-combined-ca-bundle\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " 
pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053272 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-public-tls-certs\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053293 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeabd5f0-6573-402d-a5df-c0bc41d16a67-run-httpd\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053326 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeabd5f0-6573-402d-a5df-c0bc41d16a67-log-httpd\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053348 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pzxd\" (UniqueName: \"kubernetes.io/projected/44d41a47-16c3-4bd1-be08-b06bd6f8734f-kube-api-access-4pzxd\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/44d41a47-16c3-4bd1-be08-b06bd6f8734f-openstack-config-secret\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 
19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053414 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/44d41a47-16c3-4bd1-be08-b06bd6f8734f-openstack-config\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053428 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d41a47-16c3-4bd1-be08-b06bd6f8734f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-internal-tls-certs\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053487 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wddxm\" (UniqueName: \"kubernetes.io/projected/aeabd5f0-6573-402d-a5df-c0bc41d16a67-kube-api-access-wddxm\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.053508 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aeabd5f0-6573-402d-a5df-c0bc41d16a67-etc-swift\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 
19:19:32.055346 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeabd5f0-6573-402d-a5df-c0bc41d16a67-run-httpd\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.055610 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeabd5f0-6573-402d-a5df-c0bc41d16a67-log-httpd\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.061493 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aeabd5f0-6573-402d-a5df-c0bc41d16a67-etc-swift\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.062331 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-combined-ca-bundle\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.062489 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-config-data\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.062924 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-internal-tls-certs\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.072028 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wddxm\" (UniqueName: \"kubernetes.io/projected/aeabd5f0-6573-402d-a5df-c0bc41d16a67-kube-api-access-wddxm\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.077757 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeabd5f0-6573-402d-a5df-c0bc41d16a67-public-tls-certs\") pod \"swift-proxy-57f8b4b869-t42c2\" (UID: \"aeabd5f0-6573-402d-a5df-c0bc41d16a67\") " pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.154579 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pzxd\" (UniqueName: \"kubernetes.io/projected/44d41a47-16c3-4bd1-be08-b06bd6f8734f-kube-api-access-4pzxd\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.154634 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/44d41a47-16c3-4bd1-be08-b06bd6f8734f-openstack-config-secret\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.154672 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/44d41a47-16c3-4bd1-be08-b06bd6f8734f-openstack-config\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.154687 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d41a47-16c3-4bd1-be08-b06bd6f8734f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.155753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/44d41a47-16c3-4bd1-be08-b06bd6f8734f-openstack-config\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.159522 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/44d41a47-16c3-4bd1-be08-b06bd6f8734f-openstack-config-secret\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.159898 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d41a47-16c3-4bd1-be08-b06bd6f8734f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.174562 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pzxd\" (UniqueName: \"kubernetes.io/projected/44d41a47-16c3-4bd1-be08-b06bd6f8734f-kube-api-access-4pzxd\") pod \"openstackclient\" (UID: \"44d41a47-16c3-4bd1-be08-b06bd6f8734f\") " 
pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.186023 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.290653 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.739935 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-57f8b4b869-t42c2"] Oct 07 19:19:32 crc kubenswrapper[4825]: I1007 19:19:32.801778 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 19:19:32 crc kubenswrapper[4825]: W1007 19:19:32.807700 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44d41a47_16c3_4bd1_be08_b06bd6f8734f.slice/crio-0ab913e4e58d5efddfe9be2fbf9cb474e3c66a8c8aac770662e49fa0016fba0f WatchSource:0}: Error finding container 0ab913e4e58d5efddfe9be2fbf9cb474e3c66a8c8aac770662e49fa0016fba0f: Status 404 returned error can't find the container with id 0ab913e4e58d5efddfe9be2fbf9cb474e3c66a8c8aac770662e49fa0016fba0f Oct 07 19:19:33 crc kubenswrapper[4825]: I1007 19:19:33.030348 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"44d41a47-16c3-4bd1-be08-b06bd6f8734f","Type":"ContainerStarted","Data":"0ab913e4e58d5efddfe9be2fbf9cb474e3c66a8c8aac770662e49fa0016fba0f"} Oct 07 19:19:33 crc kubenswrapper[4825]: I1007 19:19:33.032031 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57f8b4b869-t42c2" event={"ID":"aeabd5f0-6573-402d-a5df-c0bc41d16a67","Type":"ContainerStarted","Data":"df50d014522f843413b07baddeff96c4ca51229e1b36bb929569eb9313cc6ab0"} Oct 07 19:19:33 crc kubenswrapper[4825]: I1007 19:19:33.032082 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-57f8b4b869-t42c2" event={"ID":"aeabd5f0-6573-402d-a5df-c0bc41d16a67","Type":"ContainerStarted","Data":"a24f85220f31dcb93c80d87136930c94db5e36a81f52c205fcf7ff31cb993903"} Oct 07 19:19:34 crc kubenswrapper[4825]: I1007 19:19:34.040910 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57f8b4b869-t42c2" event={"ID":"aeabd5f0-6573-402d-a5df-c0bc41d16a67","Type":"ContainerStarted","Data":"ea10036987b12909b9eab498038a260d9dd0cb33c2feb0a750eec502f5dc5584"} Oct 07 19:19:34 crc kubenswrapper[4825]: I1007 19:19:34.041157 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:34 crc kubenswrapper[4825]: I1007 19:19:34.066611 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-57f8b4b869-t42c2" podStartSLOduration=3.066595948 podStartE2EDuration="3.066595948s" podCreationTimestamp="2025-10-07 19:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:19:34.06259691 +0000 UTC m=+1162.884635557" watchObservedRunningTime="2025-10-07 19:19:34.066595948 +0000 UTC m=+1162.888634585" Oct 07 19:19:35 crc kubenswrapper[4825]: I1007 19:19:35.061562 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:35 crc kubenswrapper[4825]: I1007 19:19:35.492202 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 19:19:35 crc kubenswrapper[4825]: I1007 19:19:35.718534 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:19:35 crc 
kubenswrapper[4825]: I1007 19:19:35.718628 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.085826 4825 generic.go:334] "Generic (PLEG): container finished" podID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerID="aa5a0553e86641282d137d798de16e1b9de03cf1d45e9d0fb725517914e54730" exitCode=0 Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.085874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerDied","Data":"aa5a0553e86641282d137d798de16e1b9de03cf1d45e9d0fb725517914e54730"} Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.345473 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.438459 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-log-httpd\") pod \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.438497 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-scripts\") pod \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.438525 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4kzb\" (UniqueName: \"kubernetes.io/projected/f475d4d9-9da4-48b3-a999-0b53d1ef346c-kube-api-access-v4kzb\") pod \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.438595 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-config-data\") pod \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.438637 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-sg-core-conf-yaml\") pod \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.438678 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-combined-ca-bundle\") pod \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.438723 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-run-httpd\") pod \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\" (UID: \"f475d4d9-9da4-48b3-a999-0b53d1ef346c\") " Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.439200 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f475d4d9-9da4-48b3-a999-0b53d1ef346c" (UID: "f475d4d9-9da4-48b3-a999-0b53d1ef346c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.439418 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f475d4d9-9da4-48b3-a999-0b53d1ef346c" (UID: "f475d4d9-9da4-48b3-a999-0b53d1ef346c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.444370 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f475d4d9-9da4-48b3-a999-0b53d1ef346c-kube-api-access-v4kzb" (OuterVolumeSpecName: "kube-api-access-v4kzb") pod "f475d4d9-9da4-48b3-a999-0b53d1ef346c" (UID: "f475d4d9-9da4-48b3-a999-0b53d1ef346c"). InnerVolumeSpecName "kube-api-access-v4kzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.444927 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-scripts" (OuterVolumeSpecName: "scripts") pod "f475d4d9-9da4-48b3-a999-0b53d1ef346c" (UID: "f475d4d9-9da4-48b3-a999-0b53d1ef346c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.472559 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f475d4d9-9da4-48b3-a999-0b53d1ef346c" (UID: "f475d4d9-9da4-48b3-a999-0b53d1ef346c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.522791 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f475d4d9-9da4-48b3-a999-0b53d1ef346c" (UID: "f475d4d9-9da4-48b3-a999-0b53d1ef346c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.542378 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.542679 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f475d4d9-9da4-48b3-a999-0b53d1ef346c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.542804 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.542904 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4kzb\" (UniqueName: \"kubernetes.io/projected/f475d4d9-9da4-48b3-a999-0b53d1ef346c-kube-api-access-v4kzb\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.543016 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.543127 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.560458 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-config-data" (OuterVolumeSpecName: "config-data") pod "f475d4d9-9da4-48b3-a999-0b53d1ef346c" (UID: "f475d4d9-9da4-48b3-a999-0b53d1ef346c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:36 crc kubenswrapper[4825]: I1007 19:19:36.645190 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f475d4d9-9da4-48b3-a999-0b53d1ef346c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.096953 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f475d4d9-9da4-48b3-a999-0b53d1ef346c","Type":"ContainerDied","Data":"10427bd48ec7d7dfea3171e0a1ffee9d926bbb4ffde055dcf5e657db3d47c6b8"} Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.097015 4825 scope.go:117] "RemoveContainer" containerID="80ab38ef827c51547550368942fb145f1972d84d6827c922550de0645a815544" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.097039 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.135837 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.142948 4825 scope.go:117] "RemoveContainer" containerID="82c06c1fb3a3b098e32195151b334fdb1ca79a78466d65cda9c856f55c6238b1" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.153907 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.163409 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:37 crc kubenswrapper[4825]: E1007 19:19:37.163886 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="proxy-httpd" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.163906 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" 
containerName="proxy-httpd" Oct 07 19:19:37 crc kubenswrapper[4825]: E1007 19:19:37.163925 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-central-agent" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.163933 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-central-agent" Oct 07 19:19:37 crc kubenswrapper[4825]: E1007 19:19:37.163953 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-notification-agent" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.163963 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-notification-agent" Oct 07 19:19:37 crc kubenswrapper[4825]: E1007 19:19:37.163992 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="sg-core" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.164001 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="sg-core" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.164273 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="sg-core" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.164306 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-notification-agent" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.164326 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="proxy-httpd" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.164341 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" containerName="ceilometer-central-agent" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.166598 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.168346 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.168462 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.171751 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.190833 4825 scope.go:117] "RemoveContainer" containerID="aa5a0553e86641282d137d798de16e1b9de03cf1d45e9d0fb725517914e54730" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.195667 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.217931 4825 scope.go:117] "RemoveContainer" containerID="6bd2699dce7647575a45a710a982173a379b5c0e9d45437acbbda5c7a4d62717" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.254361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-run-httpd\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.254413 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrq7\" (UniqueName: \"kubernetes.io/projected/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-kube-api-access-2lrq7\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " 
pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.254468 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.254656 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.254842 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-log-httpd\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.254927 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-scripts\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.255046 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-config-data\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.356988 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-run-httpd\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.357031 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrq7\" (UniqueName: \"kubernetes.io/projected/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-kube-api-access-2lrq7\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.357074 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.357096 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.357133 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-log-httpd\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.357160 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-scripts\") pod \"ceilometer-0\" (UID: 
\"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.357185 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-config-data\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.358295 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-log-httpd\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.360135 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-run-httpd\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.362173 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-config-data\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.362199 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.363619 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-scripts\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.370822 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.384973 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrq7\" (UniqueName: \"kubernetes.io/projected/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-kube-api-access-2lrq7\") pod \"ceilometer-0\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.493635 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.510951 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.511153 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="66c0c344-091c-42cf-bfbb-bbdc83a37bce" containerName="kube-state-metrics" containerID="cri-o://a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275" gracePeriod=30 Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.813076 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f475d4d9-9da4-48b3-a999-0b53d1ef346c" path="/var/lib/kubelet/pods/f475d4d9-9da4-48b3-a999-0b53d1ef346c/volumes" Oct 07 19:19:37 crc kubenswrapper[4825]: I1007 19:19:37.963041 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.018151 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.071541 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h2s5\" (UniqueName: \"kubernetes.io/projected/66c0c344-091c-42cf-bfbb-bbdc83a37bce-kube-api-access-8h2s5\") pod \"66c0c344-091c-42cf-bfbb-bbdc83a37bce\" (UID: \"66c0c344-091c-42cf-bfbb-bbdc83a37bce\") " Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.078408 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c0c344-091c-42cf-bfbb-bbdc83a37bce-kube-api-access-8h2s5" (OuterVolumeSpecName: "kube-api-access-8h2s5") pod "66c0c344-091c-42cf-bfbb-bbdc83a37bce" (UID: "66c0c344-091c-42cf-bfbb-bbdc83a37bce"). InnerVolumeSpecName "kube-api-access-8h2s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.109940 4825 generic.go:334] "Generic (PLEG): container finished" podID="66c0c344-091c-42cf-bfbb-bbdc83a37bce" containerID="a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275" exitCode=2 Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.109997 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66c0c344-091c-42cf-bfbb-bbdc83a37bce","Type":"ContainerDied","Data":"a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275"} Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.110022 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66c0c344-091c-42cf-bfbb-bbdc83a37bce","Type":"ContainerDied","Data":"74d425e7f70445a3dd181fd6150a288a7da1df26ac7a1c2d2185b83d9bc3e091"} Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.110039 4825 scope.go:117] 
"RemoveContainer" containerID="a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.110124 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.115637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerStarted","Data":"11bd60d957ebd651bcaeca3b0c605bab249e9502d10596daef471538f98a031c"} Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.147177 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.149135 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.158023 4825 scope.go:117] "RemoveContainer" containerID="a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275" Oct 07 19:19:38 crc kubenswrapper[4825]: E1007 19:19:38.158929 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275\": container with ID starting with a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275 not found: ID does not exist" containerID="a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.158971 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275"} err="failed to get container status \"a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275\": rpc error: code = NotFound desc = could not find container \"a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275\": 
container with ID starting with a9869f60b400315b68c3bb0bd77d641bee19275a612ab02aef9c433afc62f275 not found: ID does not exist" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.159270 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:19:38 crc kubenswrapper[4825]: E1007 19:19:38.159548 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c0c344-091c-42cf-bfbb-bbdc83a37bce" containerName="kube-state-metrics" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.159559 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c0c344-091c-42cf-bfbb-bbdc83a37bce" containerName="kube-state-metrics" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.159720 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c0c344-091c-42cf-bfbb-bbdc83a37bce" containerName="kube-state-metrics" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.171681 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.178639 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.179046 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.179530 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h2s5\" (UniqueName: \"kubernetes.io/projected/66c0c344-091c-42cf-bfbb-bbdc83a37bce-kube-api-access-8h2s5\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.179759 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.281443 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.281914 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.281942 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdlb9\" (UniqueName: \"kubernetes.io/projected/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-api-access-gdlb9\") pod 
\"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.282007 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.384254 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.384303 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.384325 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdlb9\" (UniqueName: \"kubernetes.io/projected/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-api-access-gdlb9\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.384379 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.388942 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.392891 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.401477 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.404815 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdlb9\" (UniqueName: \"kubernetes.io/projected/f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e-kube-api-access-gdlb9\") pod \"kube-state-metrics-0\" (UID: \"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e\") " pod="openstack/kube-state-metrics-0" Oct 07 19:19:38 crc kubenswrapper[4825]: I1007 19:19:38.503773 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 19:19:39 crc kubenswrapper[4825]: I1007 19:19:39.390454 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:39 crc kubenswrapper[4825]: I1007 19:19:39.805264 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c0c344-091c-42cf-bfbb-bbdc83a37bce" path="/var/lib/kubelet/pods/66c0c344-091c-42cf-bfbb-bbdc83a37bce/volumes" Oct 07 19:19:40 crc kubenswrapper[4825]: I1007 19:19:40.409493 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-566c6c8d88-h74t9" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 07 19:19:40 crc kubenswrapper[4825]: I1007 19:19:40.409869 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:19:42 crc kubenswrapper[4825]: I1007 19:19:42.197563 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-57f8b4b869-t42c2" Oct 07 19:19:43 crc kubenswrapper[4825]: W1007 19:19:43.814215 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8d0cfb5_1a41_442c_b8d3_b1f3e2d8418e.slice/crio-da45c49303db463e06856eb86e2e5d379127a6bac437feba18c3ef0abceece45 WatchSource:0}: Error finding container da45c49303db463e06856eb86e2e5d379127a6bac437feba18c3ef0abceece45: Status 404 returned error can't find the container with id da45c49303db463e06856eb86e2e5d379127a6bac437feba18c3ef0abceece45 Oct 07 19:19:43 crc kubenswrapper[4825]: I1007 19:19:43.816075 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 19:19:44 crc kubenswrapper[4825]: I1007 19:19:44.187865 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e","Type":"ContainerStarted","Data":"da45c49303db463e06856eb86e2e5d379127a6bac437feba18c3ef0abceece45"} Oct 07 19:19:44 crc kubenswrapper[4825]: I1007 19:19:44.189953 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerStarted","Data":"12c8d43abe88e1b10f5cb2180cb77b8de0ddda2c786ed039ee853b2bb37e81cf"} Oct 07 19:19:44 crc kubenswrapper[4825]: I1007 19:19:44.192630 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"44d41a47-16c3-4bd1-be08-b06bd6f8734f","Type":"ContainerStarted","Data":"fde6255a0c88dea5569017022d6e3dfca32a4dd989343ea5d258073a88f7542e"} Oct 07 19:19:44 crc kubenswrapper[4825]: I1007 19:19:44.207995 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.493967075 podStartE2EDuration="13.207979069s" podCreationTimestamp="2025-10-07 19:19:31 +0000 UTC" firstStartedPulling="2025-10-07 19:19:32.814381084 +0000 UTC m=+1161.636419721" lastFinishedPulling="2025-10-07 19:19:43.528393078 +0000 UTC m=+1172.350431715" observedRunningTime="2025-10-07 19:19:44.207336488 +0000 UTC m=+1173.029375125" watchObservedRunningTime="2025-10-07 19:19:44.207979069 +0000 UTC m=+1173.030017706" Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.007374 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.007802 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-log" containerID="cri-o://26500b8f142625bd8a7d7547a151da17091c720ad58d1b05f0226cafe735907c" gracePeriod=30 Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 
19:19:45.007908 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-httpd" containerID="cri-o://a4559148f6852c76e778a887208ad48edafe31301d7b04b04ef0e1b0ab844138" gracePeriod=30 Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.205359 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e","Type":"ContainerStarted","Data":"a3a10804e9705dc3749b3ceee8adbec137c8aa07c4ffaadf352396a3775f19e4"} Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.205438 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.208711 4825 generic.go:334] "Generic (PLEG): container finished" podID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerID="26500b8f142625bd8a7d7547a151da17091c720ad58d1b05f0226cafe735907c" exitCode=143 Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.208778 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779","Type":"ContainerDied","Data":"26500b8f142625bd8a7d7547a151da17091c720ad58d1b05f0226cafe735907c"} Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.211997 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerStarted","Data":"317ff6948ba4ca27286b8513b49e1bc6ce6e12f0454f4f345abc1c8c36003d16"} Oct 07 19:19:45 crc kubenswrapper[4825]: I1007 19:19:45.212035 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerStarted","Data":"af786fd2fe48c52347838052ac7346b57db05c49d101bb0f04ea95f9672f71d5"} Oct 07 19:19:45 crc kubenswrapper[4825]: 
I1007 19:19:45.227723 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=6.815257692 podStartE2EDuration="7.22770794s" podCreationTimestamp="2025-10-07 19:19:38 +0000 UTC" firstStartedPulling="2025-10-07 19:19:43.822120505 +0000 UTC m=+1172.644159142" lastFinishedPulling="2025-10-07 19:19:44.234570753 +0000 UTC m=+1173.056609390" observedRunningTime="2025-10-07 19:19:45.22236016 +0000 UTC m=+1174.044398797" watchObservedRunningTime="2025-10-07 19:19:45.22770794 +0000 UTC m=+1174.049746577" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.140515 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.222484 4825 generic.go:334] "Generic (PLEG): container finished" podID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerID="aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840" exitCode=137 Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.222531 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566c6c8d88-h74t9" event={"ID":"b66fe3a9-9849-4219-badb-a0cecbb2a388","Type":"ContainerDied","Data":"aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840"} Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.222564 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-566c6c8d88-h74t9" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.222598 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566c6c8d88-h74t9" event={"ID":"b66fe3a9-9849-4219-badb-a0cecbb2a388","Type":"ContainerDied","Data":"19a90f1f23ed8a8f03b71fa604a7882b2bb5e3ffd46fc3e08e208306b33d96f5"} Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.222632 4825 scope.go:117] "RemoveContainer" containerID="b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.280292 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66fe3a9-9849-4219-badb-a0cecbb2a388-logs\") pod \"b66fe3a9-9849-4219-badb-a0cecbb2a388\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.280377 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-combined-ca-bundle\") pod \"b66fe3a9-9849-4219-badb-a0cecbb2a388\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.280396 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-config-data\") pod \"b66fe3a9-9849-4219-badb-a0cecbb2a388\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.280442 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-scripts\") pod \"b66fe3a9-9849-4219-badb-a0cecbb2a388\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.280511 
4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-tls-certs\") pod \"b66fe3a9-9849-4219-badb-a0cecbb2a388\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.280576 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk2s7\" (UniqueName: \"kubernetes.io/projected/b66fe3a9-9849-4219-badb-a0cecbb2a388-kube-api-access-vk2s7\") pod \"b66fe3a9-9849-4219-badb-a0cecbb2a388\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.280604 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-secret-key\") pod \"b66fe3a9-9849-4219-badb-a0cecbb2a388\" (UID: \"b66fe3a9-9849-4219-badb-a0cecbb2a388\") " Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.281702 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66fe3a9-9849-4219-badb-a0cecbb2a388-logs" (OuterVolumeSpecName: "logs") pod "b66fe3a9-9849-4219-badb-a0cecbb2a388" (UID: "b66fe3a9-9849-4219-badb-a0cecbb2a388"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.290339 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b66fe3a9-9849-4219-badb-a0cecbb2a388" (UID: "b66fe3a9-9849-4219-badb-a0cecbb2a388"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.292344 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66fe3a9-9849-4219-badb-a0cecbb2a388-kube-api-access-vk2s7" (OuterVolumeSpecName: "kube-api-access-vk2s7") pod "b66fe3a9-9849-4219-badb-a0cecbb2a388" (UID: "b66fe3a9-9849-4219-badb-a0cecbb2a388"). InnerVolumeSpecName "kube-api-access-vk2s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.313579 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-scripts" (OuterVolumeSpecName: "scripts") pod "b66fe3a9-9849-4219-badb-a0cecbb2a388" (UID: "b66fe3a9-9849-4219-badb-a0cecbb2a388"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.315862 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-config-data" (OuterVolumeSpecName: "config-data") pod "b66fe3a9-9849-4219-badb-a0cecbb2a388" (UID: "b66fe3a9-9849-4219-badb-a0cecbb2a388"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.337680 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b66fe3a9-9849-4219-badb-a0cecbb2a388" (UID: "b66fe3a9-9849-4219-badb-a0cecbb2a388"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.369451 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b66fe3a9-9849-4219-badb-a0cecbb2a388" (UID: "b66fe3a9-9849-4219-badb-a0cecbb2a388"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.382424 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66fe3a9-9849-4219-badb-a0cecbb2a388-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.382450 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.382460 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.382470 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66fe3a9-9849-4219-badb-a0cecbb2a388-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.382480 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.382489 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk2s7\" (UniqueName: 
\"kubernetes.io/projected/b66fe3a9-9849-4219-badb-a0cecbb2a388-kube-api-access-vk2s7\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.382497 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66fe3a9-9849-4219-badb-a0cecbb2a388-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.397632 4825 scope.go:117] "RemoveContainer" containerID="aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.437137 4825 scope.go:117] "RemoveContainer" containerID="b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20" Oct 07 19:19:46 crc kubenswrapper[4825]: E1007 19:19:46.437688 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20\": container with ID starting with b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20 not found: ID does not exist" containerID="b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.437722 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20"} err="failed to get container status \"b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20\": rpc error: code = NotFound desc = could not find container \"b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20\": container with ID starting with b617ed135554f80f64aa57a58a8e8eeb06038dec9d1d88a161cbbda0bc1b3b20 not found: ID does not exist" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.437745 4825 scope.go:117] "RemoveContainer" containerID="aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840" Oct 07 
19:19:46 crc kubenswrapper[4825]: E1007 19:19:46.438073 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840\": container with ID starting with aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840 not found: ID does not exist" containerID="aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.438100 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840"} err="failed to get container status \"aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840\": rpc error: code = NotFound desc = could not find container \"aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840\": container with ID starting with aea12da3989f209c9f5f62775f725a8aaf79a64c22af740047e00c937865a840 not found: ID does not exist" Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.585836 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-566c6c8d88-h74t9"] Oct 07 19:19:46 crc kubenswrapper[4825]: I1007 19:19:46.593835 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-566c6c8d88-h74t9"] Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.244604 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerStarted","Data":"72532f7e3d3967f96da5361c983d5bb2182f296a4ac5546316a9d91cda8c42f4"} Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.245018 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-central-agent" 
containerID="cri-o://12c8d43abe88e1b10f5cb2180cb77b8de0ddda2c786ed039ee853b2bb37e81cf" gracePeriod=30 Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.245114 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="proxy-httpd" containerID="cri-o://72532f7e3d3967f96da5361c983d5bb2182f296a4ac5546316a9d91cda8c42f4" gracePeriod=30 Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.245130 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.245155 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="sg-core" containerID="cri-o://317ff6948ba4ca27286b8513b49e1bc6ce6e12f0454f4f345abc1c8c36003d16" gracePeriod=30 Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.245194 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-notification-agent" containerID="cri-o://af786fd2fe48c52347838052ac7346b57db05c49d101bb0f04ea95f9672f71d5" gracePeriod=30 Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.264264 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2lwlm"] Oct 07 19:19:47 crc kubenswrapper[4825]: E1007 19:19:47.265207 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon-log" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.265329 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon-log" Oct 07 19:19:47 crc kubenswrapper[4825]: E1007 19:19:47.265454 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.265524 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.265817 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon-log" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.265898 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" containerName="horizon" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.266746 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2lwlm" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.277314 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2lwlm"] Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.280647 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.859379633 podStartE2EDuration="10.280629111s" podCreationTimestamp="2025-10-07 19:19:37 +0000 UTC" firstStartedPulling="2025-10-07 19:19:38.036201513 +0000 UTC m=+1166.858240150" lastFinishedPulling="2025-10-07 19:19:46.457450991 +0000 UTC m=+1175.279489628" observedRunningTime="2025-10-07 19:19:47.278661749 +0000 UTC m=+1176.100700386" watchObservedRunningTime="2025-10-07 19:19:47.280629111 +0000 UTC m=+1176.102667748" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.359022 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vnpfn"] Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.360182 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vnpfn" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.369601 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vnpfn"] Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.403014 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mbx\" (UniqueName: \"kubernetes.io/projected/722d9cc6-daa7-4ca6-b795-93734a5d3c3c-kube-api-access-l8mbx\") pod \"nova-api-db-create-2lwlm\" (UID: \"722d9cc6-daa7-4ca6-b795-93734a5d3c3c\") " pod="openstack/nova-api-db-create-2lwlm" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.452412 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-rqhll"] Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.453858 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rqhll" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.459646 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rqhll"] Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.504366 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b45fz\" (UniqueName: \"kubernetes.io/projected/fd8a68bd-3d55-4267-b003-773c5444996f-kube-api-access-b45fz\") pod \"nova-cell0-db-create-vnpfn\" (UID: \"fd8a68bd-3d55-4267-b003-773c5444996f\") " pod="openstack/nova-cell0-db-create-vnpfn" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.504435 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mbx\" (UniqueName: \"kubernetes.io/projected/722d9cc6-daa7-4ca6-b795-93734a5d3c3c-kube-api-access-l8mbx\") pod \"nova-api-db-create-2lwlm\" (UID: \"722d9cc6-daa7-4ca6-b795-93734a5d3c3c\") " pod="openstack/nova-api-db-create-2lwlm" Oct 07 19:19:47 crc 
kubenswrapper[4825]: I1007 19:19:47.526410 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mbx\" (UniqueName: \"kubernetes.io/projected/722d9cc6-daa7-4ca6-b795-93734a5d3c3c-kube-api-access-l8mbx\") pod \"nova-api-db-create-2lwlm\" (UID: \"722d9cc6-daa7-4ca6-b795-93734a5d3c3c\") " pod="openstack/nova-api-db-create-2lwlm" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.606942 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrwd\" (UniqueName: \"kubernetes.io/projected/9711a494-299c-48ba-9ec3-6cc8e2ede8f3-kube-api-access-mrrwd\") pod \"nova-cell1-db-create-rqhll\" (UID: \"9711a494-299c-48ba-9ec3-6cc8e2ede8f3\") " pod="openstack/nova-cell1-db-create-rqhll" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.607674 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b45fz\" (UniqueName: \"kubernetes.io/projected/fd8a68bd-3d55-4267-b003-773c5444996f-kube-api-access-b45fz\") pod \"nova-cell0-db-create-vnpfn\" (UID: \"fd8a68bd-3d55-4267-b003-773c5444996f\") " pod="openstack/nova-cell0-db-create-vnpfn" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.626385 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b45fz\" (UniqueName: \"kubernetes.io/projected/fd8a68bd-3d55-4267-b003-773c5444996f-kube-api-access-b45fz\") pod \"nova-cell0-db-create-vnpfn\" (UID: \"fd8a68bd-3d55-4267-b003-773c5444996f\") " pod="openstack/nova-cell0-db-create-vnpfn" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.665474 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2lwlm" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.709782 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrwd\" (UniqueName: \"kubernetes.io/projected/9711a494-299c-48ba-9ec3-6cc8e2ede8f3-kube-api-access-mrrwd\") pod \"nova-cell1-db-create-rqhll\" (UID: \"9711a494-299c-48ba-9ec3-6cc8e2ede8f3\") " pod="openstack/nova-cell1-db-create-rqhll" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.725302 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrwd\" (UniqueName: \"kubernetes.io/projected/9711a494-299c-48ba-9ec3-6cc8e2ede8f3-kube-api-access-mrrwd\") pod \"nova-cell1-db-create-rqhll\" (UID: \"9711a494-299c-48ba-9ec3-6cc8e2ede8f3\") " pod="openstack/nova-cell1-db-create-rqhll" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.729312 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vnpfn" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.785432 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rqhll" Oct 07 19:19:47 crc kubenswrapper[4825]: I1007 19:19:47.810982 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66fe3a9-9849-4219-badb-a0cecbb2a388" path="/var/lib/kubelet/pods/b66fe3a9-9849-4219-badb-a0cecbb2a388/volumes" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.117830 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2lwlm"] Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.250755 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vnpfn"] Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.281312 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2lwlm" event={"ID":"722d9cc6-daa7-4ca6-b795-93734a5d3c3c","Type":"ContainerStarted","Data":"c5649052d75c7adaca733668f74628f3f9c55fbbe01b67597fb1e4ad1cb34df5"} Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.285961 4825 generic.go:334] "Generic (PLEG): container finished" podID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerID="a4559148f6852c76e778a887208ad48edafe31301d7b04b04ef0e1b0ab844138" exitCode=0 Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.286072 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779","Type":"ContainerDied","Data":"a4559148f6852c76e778a887208ad48edafe31301d7b04b04ef0e1b0ab844138"} Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.297007 4825 generic.go:334] "Generic (PLEG): container finished" podID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerID="72532f7e3d3967f96da5361c983d5bb2182f296a4ac5546316a9d91cda8c42f4" exitCode=0 Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.297049 4825 generic.go:334] "Generic (PLEG): container finished" podID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" 
containerID="317ff6948ba4ca27286b8513b49e1bc6ce6e12f0454f4f345abc1c8c36003d16" exitCode=2 Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.297060 4825 generic.go:334] "Generic (PLEG): container finished" podID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerID="af786fd2fe48c52347838052ac7346b57db05c49d101bb0f04ea95f9672f71d5" exitCode=0 Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.297081 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerDied","Data":"72532f7e3d3967f96da5361c983d5bb2182f296a4ac5546316a9d91cda8c42f4"} Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.297109 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerDied","Data":"317ff6948ba4ca27286b8513b49e1bc6ce6e12f0454f4f345abc1c8c36003d16"} Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.297123 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerDied","Data":"af786fd2fe48c52347838052ac7346b57db05c49d101bb0f04ea95f9672f71d5"} Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.334568 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rqhll"] Oct 07 19:19:48 crc kubenswrapper[4825]: W1007 19:19:48.340756 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9711a494_299c_48ba_9ec3_6cc8e2ede8f3.slice/crio-ae34fb0624183efc59240ad6183fdf16da3461db8521b23d3b394fd8299868c7 WatchSource:0}: Error finding container ae34fb0624183efc59240ad6183fdf16da3461db8521b23d3b394fd8299868c7: Status 404 returned error can't find the container with id ae34fb0624183efc59240ad6183fdf16da3461db8521b23d3b394fd8299868c7 Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.821931 
4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928444 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-config-data\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928487 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-combined-ca-bundle\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928553 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928598 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-public-tls-certs\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928647 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njdxs\" (UniqueName: \"kubernetes.io/projected/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-kube-api-access-njdxs\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928693 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-scripts\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928818 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-httpd-run\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.928853 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-logs\") pod \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\" (UID: \"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779\") " Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.929502 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.929908 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-logs" (OuterVolumeSpecName: "logs") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.935860 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-scripts" (OuterVolumeSpecName: "scripts") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.935927 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-kube-api-access-njdxs" (OuterVolumeSpecName: "kube-api-access-njdxs") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "kube-api-access-njdxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.935950 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.961541 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.984479 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-config-data" (OuterVolumeSpecName: "config-data") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:48 crc kubenswrapper[4825]: I1007 19:19:48.993193 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" (UID: "05e1cdf5-2de6-438e-b0f1-b05a7c1a2779"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.031386 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njdxs\" (UniqueName: \"kubernetes.io/projected/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-kube-api-access-njdxs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.031710 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.031725 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.031740 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc 
kubenswrapper[4825]: I1007 19:19:49.031751 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.031761 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.031800 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.031812 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.058101 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.133622 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.306799 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05e1cdf5-2de6-438e-b0f1-b05a7c1a2779","Type":"ContainerDied","Data":"c6955314d90e40469bf65a41f9d689ae831f8d71c765ac7f2f50190d048e3891"} Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.306867 4825 scope.go:117] "RemoveContainer" containerID="a4559148f6852c76e778a887208ad48edafe31301d7b04b04ef0e1b0ab844138" 
Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.306871 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.313484 4825 generic.go:334] "Generic (PLEG): container finished" podID="fd8a68bd-3d55-4267-b003-773c5444996f" containerID="2fdc77b1ad585ca5f7b1429c1cc480627b46203aa0e26b61e4eada4faaef1da7" exitCode=0 Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.313641 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vnpfn" event={"ID":"fd8a68bd-3d55-4267-b003-773c5444996f","Type":"ContainerDied","Data":"2fdc77b1ad585ca5f7b1429c1cc480627b46203aa0e26b61e4eada4faaef1da7"} Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.313863 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vnpfn" event={"ID":"fd8a68bd-3d55-4267-b003-773c5444996f","Type":"ContainerStarted","Data":"11576a82f07cb1e441e8a99096d07f4a5dca0359224e4056f93e6f66fd7b95d6"} Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.321868 4825 generic.go:334] "Generic (PLEG): container finished" podID="9711a494-299c-48ba-9ec3-6cc8e2ede8f3" containerID="ae50acada5639c5d54ea3a51c1c5c4d876d000e9d74ecd86edfdf24796c4429d" exitCode=0 Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.321974 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rqhll" event={"ID":"9711a494-299c-48ba-9ec3-6cc8e2ede8f3","Type":"ContainerDied","Data":"ae50acada5639c5d54ea3a51c1c5c4d876d000e9d74ecd86edfdf24796c4429d"} Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.322020 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rqhll" event={"ID":"9711a494-299c-48ba-9ec3-6cc8e2ede8f3","Type":"ContainerStarted","Data":"ae34fb0624183efc59240ad6183fdf16da3461db8521b23d3b394fd8299868c7"} Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 
19:19:49.335464 4825 generic.go:334] "Generic (PLEG): container finished" podID="722d9cc6-daa7-4ca6-b795-93734a5d3c3c" containerID="482db4f395495ed28699f29794a9f3ab54417a6cac5a29d96e47fed53f20bc66" exitCode=0 Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.335536 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2lwlm" event={"ID":"722d9cc6-daa7-4ca6-b795-93734a5d3c3c","Type":"ContainerDied","Data":"482db4f395495ed28699f29794a9f3ab54417a6cac5a29d96e47fed53f20bc66"} Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.376201 4825 scope.go:117] "RemoveContainer" containerID="26500b8f142625bd8a7d7547a151da17091c720ad58d1b05f0226cafe735907c" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.408391 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.416292 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.432165 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:19:49 crc kubenswrapper[4825]: E1007 19:19:49.432604 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-httpd" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.432625 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-httpd" Oct 07 19:19:49 crc kubenswrapper[4825]: E1007 19:19:49.432648 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-log" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.432655 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-log" Oct 07 19:19:49 crc 
kubenswrapper[4825]: I1007 19:19:49.432861 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-log" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.432891 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" containerName="glance-httpd" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.433865 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.436405 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.436703 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.455464 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.541792 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.541872 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.541901 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1356ee9f-f727-42b6-9a53-f80e78720704-logs\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.542016 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1356ee9f-f727-42b6-9a53-f80e78720704-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.542043 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx6k6\" (UniqueName: \"kubernetes.io/projected/1356ee9f-f727-42b6-9a53-f80e78720704-kube-api-access-zx6k6\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.542069 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.542334 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-config-data\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.542393 
4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-scripts\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.644204 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.644286 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.644317 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1356ee9f-f727-42b6-9a53-f80e78720704-logs\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.644401 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1356ee9f-f727-42b6-9a53-f80e78720704-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.644428 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zx6k6\" (UniqueName: \"kubernetes.io/projected/1356ee9f-f727-42b6-9a53-f80e78720704-kube-api-access-zx6k6\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.644446 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.645419 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1356ee9f-f727-42b6-9a53-f80e78720704-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.645667 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1356ee9f-f727-42b6-9a53-f80e78720704-logs\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.644455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.646136 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-config-data\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.646166 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-scripts\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.655475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.655922 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-config-data\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.657548 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-scripts\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.657820 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1356ee9f-f727-42b6-9a53-f80e78720704-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.666373 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx6k6\" (UniqueName: \"kubernetes.io/projected/1356ee9f-f727-42b6-9a53-f80e78720704-kube-api-access-zx6k6\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.686055 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1356ee9f-f727-42b6-9a53-f80e78720704\") " pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.749176 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 19:19:49 crc kubenswrapper[4825]: I1007 19:19:49.808867 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e1cdf5-2de6-438e-b0f1-b05a7c1a2779" path="/var/lib/kubelet/pods/05e1cdf5-2de6-438e-b0f1-b05a7c1a2779/volumes" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.324985 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 19:19:50 crc kubenswrapper[4825]: W1007 19:19:50.333297 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1356ee9f_f727_42b6_9a53_f80e78720704.slice/crio-ec66b2fe650d334122e8a754631ded589f8e6509c08c6008849281d9ce35616f WatchSource:0}: Error finding container ec66b2fe650d334122e8a754631ded589f8e6509c08c6008849281d9ce35616f: Status 404 returned error can't find the container with id ec66b2fe650d334122e8a754631ded589f8e6509c08c6008849281d9ce35616f Oct 07 
19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.347310 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1356ee9f-f727-42b6-9a53-f80e78720704","Type":"ContainerStarted","Data":"ec66b2fe650d334122e8a754631ded589f8e6509c08c6008849281d9ce35616f"} Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.706855 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2lwlm" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.787031 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vnpfn" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.802754 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rqhll" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.879274 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrrwd\" (UniqueName: \"kubernetes.io/projected/9711a494-299c-48ba-9ec3-6cc8e2ede8f3-kube-api-access-mrrwd\") pod \"9711a494-299c-48ba-9ec3-6cc8e2ede8f3\" (UID: \"9711a494-299c-48ba-9ec3-6cc8e2ede8f3\") " Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.879423 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b45fz\" (UniqueName: \"kubernetes.io/projected/fd8a68bd-3d55-4267-b003-773c5444996f-kube-api-access-b45fz\") pod \"fd8a68bd-3d55-4267-b003-773c5444996f\" (UID: \"fd8a68bd-3d55-4267-b003-773c5444996f\") " Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.879557 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8mbx\" (UniqueName: \"kubernetes.io/projected/722d9cc6-daa7-4ca6-b795-93734a5d3c3c-kube-api-access-l8mbx\") pod \"722d9cc6-daa7-4ca6-b795-93734a5d3c3c\" (UID: \"722d9cc6-daa7-4ca6-b795-93734a5d3c3c\") " Oct 07 19:19:50 crc 
kubenswrapper[4825]: I1007 19:19:50.886599 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722d9cc6-daa7-4ca6-b795-93734a5d3c3c-kube-api-access-l8mbx" (OuterVolumeSpecName: "kube-api-access-l8mbx") pod "722d9cc6-daa7-4ca6-b795-93734a5d3c3c" (UID: "722d9cc6-daa7-4ca6-b795-93734a5d3c3c"). InnerVolumeSpecName "kube-api-access-l8mbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.886828 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8a68bd-3d55-4267-b003-773c5444996f-kube-api-access-b45fz" (OuterVolumeSpecName: "kube-api-access-b45fz") pod "fd8a68bd-3d55-4267-b003-773c5444996f" (UID: "fd8a68bd-3d55-4267-b003-773c5444996f"). InnerVolumeSpecName "kube-api-access-b45fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.886949 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9711a494-299c-48ba-9ec3-6cc8e2ede8f3-kube-api-access-mrrwd" (OuterVolumeSpecName: "kube-api-access-mrrwd") pod "9711a494-299c-48ba-9ec3-6cc8e2ede8f3" (UID: "9711a494-299c-48ba-9ec3-6cc8e2ede8f3"). InnerVolumeSpecName "kube-api-access-mrrwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.981552 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8mbx\" (UniqueName: \"kubernetes.io/projected/722d9cc6-daa7-4ca6-b795-93734a5d3c3c-kube-api-access-l8mbx\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.981591 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrrwd\" (UniqueName: \"kubernetes.io/projected/9711a494-299c-48ba-9ec3-6cc8e2ede8f3-kube-api-access-mrrwd\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:50 crc kubenswrapper[4825]: I1007 19:19:50.981602 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b45fz\" (UniqueName: \"kubernetes.io/projected/fd8a68bd-3d55-4267-b003-773c5444996f-kube-api-access-b45fz\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.009400 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.009690 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-httpd" containerID="cri-o://4ba4f194e908f0920dc5ec7deffa30535c2f100a35d0955b8887349efca9d864" gracePeriod=30 Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.011087 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-log" containerID="cri-o://fbacb8c9a04b1fa2222ab53c98163452b1e70381c82894595819fc1118db2b62" gracePeriod=30 Oct 07 19:19:51 crc kubenswrapper[4825]: E1007 19:19:51.268160 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20980f0a_4148_4983_8991_ea4563cfbc5a.slice/crio-fbacb8c9a04b1fa2222ab53c98163452b1e70381c82894595819fc1118db2b62.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20980f0a_4148_4983_8991_ea4563cfbc5a.slice/crio-conmon-fbacb8c9a04b1fa2222ab53c98163452b1e70381c82894595819fc1118db2b62.scope\": RecentStats: unable to find data in memory cache]" Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.381668 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1356ee9f-f727-42b6-9a53-f80e78720704","Type":"ContainerStarted","Data":"78d7cffa27ee3635ca98eca7ffd21554c856109c093e8378a070d40815ca1451"} Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.384558 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vnpfn" event={"ID":"fd8a68bd-3d55-4267-b003-773c5444996f","Type":"ContainerDied","Data":"11576a82f07cb1e441e8a99096d07f4a5dca0359224e4056f93e6f66fd7b95d6"} Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.384592 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vnpfn" Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.384694 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11576a82f07cb1e441e8a99096d07f4a5dca0359224e4056f93e6f66fd7b95d6" Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.388248 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rqhll" event={"ID":"9711a494-299c-48ba-9ec3-6cc8e2ede8f3","Type":"ContainerDied","Data":"ae34fb0624183efc59240ad6183fdf16da3461db8521b23d3b394fd8299868c7"} Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.388289 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae34fb0624183efc59240ad6183fdf16da3461db8521b23d3b394fd8299868c7" Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.388364 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rqhll" Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.391105 4825 generic.go:334] "Generic (PLEG): container finished" podID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerID="fbacb8c9a04b1fa2222ab53c98163452b1e70381c82894595819fc1118db2b62" exitCode=143 Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.391166 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"20980f0a-4148-4983-8991-ea4563cfbc5a","Type":"ContainerDied","Data":"fbacb8c9a04b1fa2222ab53c98163452b1e70381c82894595819fc1118db2b62"} Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.393293 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2lwlm" event={"ID":"722d9cc6-daa7-4ca6-b795-93734a5d3c3c","Type":"ContainerDied","Data":"c5649052d75c7adaca733668f74628f3f9c55fbbe01b67597fb1e4ad1cb34df5"} Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.393312 4825 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="c5649052d75c7adaca733668f74628f3f9c55fbbe01b67597fb1e4ad1cb34df5" Oct 07 19:19:51 crc kubenswrapper[4825]: I1007 19:19:51.393370 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2lwlm" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.413075 4825 generic.go:334] "Generic (PLEG): container finished" podID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerID="12c8d43abe88e1b10f5cb2180cb77b8de0ddda2c786ed039ee853b2bb37e81cf" exitCode=0 Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.418183 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerDied","Data":"12c8d43abe88e1b10f5cb2180cb77b8de0ddda2c786ed039ee853b2bb37e81cf"} Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.421286 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1356ee9f-f727-42b6-9a53-f80e78720704","Type":"ContainerStarted","Data":"394e2956d92b342d9f01a593e6cd1406eb1fe817e35835b43243fd6700879c31"} Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.440428 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.4404002399999998 podStartE2EDuration="3.44040024s" podCreationTimestamp="2025-10-07 19:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:19:52.436426943 +0000 UTC m=+1181.258465580" watchObservedRunningTime="2025-10-07 19:19:52.44040024 +0000 UTC m=+1181.262438877" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.713070 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.812150 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-run-httpd\") pod \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.812207 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-log-httpd\") pod \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.812346 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-combined-ca-bundle\") pod \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.812418 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-config-data\") pod \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.812444 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-sg-core-conf-yaml\") pod \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.812489 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-scripts\") pod \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.812524 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrq7\" (UniqueName: \"kubernetes.io/projected/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-kube-api-access-2lrq7\") pod \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\" (UID: \"4a6e447f-e2b4-45ec-999a-08b00d7fe14a\") " Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.813626 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a6e447f-e2b4-45ec-999a-08b00d7fe14a" (UID: "4a6e447f-e2b4-45ec-999a-08b00d7fe14a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.814328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a6e447f-e2b4-45ec-999a-08b00d7fe14a" (UID: "4a6e447f-e2b4-45ec-999a-08b00d7fe14a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.819136 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-kube-api-access-2lrq7" (OuterVolumeSpecName: "kube-api-access-2lrq7") pod "4a6e447f-e2b4-45ec-999a-08b00d7fe14a" (UID: "4a6e447f-e2b4-45ec-999a-08b00d7fe14a"). InnerVolumeSpecName "kube-api-access-2lrq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.820198 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-scripts" (OuterVolumeSpecName: "scripts") pod "4a6e447f-e2b4-45ec-999a-08b00d7fe14a" (UID: "4a6e447f-e2b4-45ec-999a-08b00d7fe14a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.850261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a6e447f-e2b4-45ec-999a-08b00d7fe14a" (UID: "4a6e447f-e2b4-45ec-999a-08b00d7fe14a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.904450 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a6e447f-e2b4-45ec-999a-08b00d7fe14a" (UID: "4a6e447f-e2b4-45ec-999a-08b00d7fe14a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.914938 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.914963 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.914973 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lrq7\" (UniqueName: \"kubernetes.io/projected/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-kube-api-access-2lrq7\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.914982 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.914995 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.915004 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:52 crc kubenswrapper[4825]: I1007 19:19:52.946622 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-config-data" (OuterVolumeSpecName: "config-data") pod "4a6e447f-e2b4-45ec-999a-08b00d7fe14a" (UID: "4a6e447f-e2b4-45ec-999a-08b00d7fe14a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.017127 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6e447f-e2b4-45ec-999a-08b00d7fe14a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.433941 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a6e447f-e2b4-45ec-999a-08b00d7fe14a","Type":"ContainerDied","Data":"11bd60d957ebd651bcaeca3b0c605bab249e9502d10596daef471538f98a031c"} Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.434005 4825 scope.go:117] "RemoveContainer" containerID="72532f7e3d3967f96da5361c983d5bb2182f296a4ac5546316a9d91cda8c42f4" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.434043 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.457309 4825 scope.go:117] "RemoveContainer" containerID="317ff6948ba4ca27286b8513b49e1bc6ce6e12f0454f4f345abc1c8c36003d16" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.484216 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.494117 4825 scope.go:117] "RemoveContainer" containerID="af786fd2fe48c52347838052ac7346b57db05c49d101bb0f04ea95f9672f71d5" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.501951 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.512826 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:53 crc kubenswrapper[4825]: E1007 19:19:53.513268 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8a68bd-3d55-4267-b003-773c5444996f" 
containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513290 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8a68bd-3d55-4267-b003-773c5444996f" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: E1007 19:19:53.513316 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="proxy-httpd" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513324 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="proxy-httpd" Oct 07 19:19:53 crc kubenswrapper[4825]: E1007 19:19:53.513338 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-notification-agent" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513345 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-notification-agent" Oct 07 19:19:53 crc kubenswrapper[4825]: E1007 19:19:53.513354 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-central-agent" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513359 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-central-agent" Oct 07 19:19:53 crc kubenswrapper[4825]: E1007 19:19:53.513371 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722d9cc6-daa7-4ca6-b795-93734a5d3c3c" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513377 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="722d9cc6-daa7-4ca6-b795-93734a5d3c3c" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: E1007 19:19:53.513386 4825 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9711a494-299c-48ba-9ec3-6cc8e2ede8f3" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513393 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9711a494-299c-48ba-9ec3-6cc8e2ede8f3" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: E1007 19:19:53.513415 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="sg-core" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513421 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="sg-core" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513597 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="722d9cc6-daa7-4ca6-b795-93734a5d3c3c" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513611 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-central-agent" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513622 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9711a494-299c-48ba-9ec3-6cc8e2ede8f3" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513629 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8a68bd-3d55-4267-b003-773c5444996f" containerName="mariadb-database-create" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513642 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="sg-core" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513654 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="ceilometer-notification-agent" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.513664 
4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" containerName="proxy-httpd" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.515308 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.518802 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.518984 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.519144 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.538154 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.539881 4825 scope.go:117] "RemoveContainer" containerID="12c8d43abe88e1b10f5cb2180cb77b8de0ddda2c786ed039ee853b2bb37e81cf" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627003 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-run-httpd\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627298 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-scripts\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627333 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627395 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57jp5\" (UniqueName: \"kubernetes.io/projected/40b6f4a6-be77-4451-b3a9-f8243334c779-kube-api-access-57jp5\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627417 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-log-httpd\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627432 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627466 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-config-data\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.627513 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.729329 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-scripts\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.729404 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.729439 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57jp5\" (UniqueName: \"kubernetes.io/projected/40b6f4a6-be77-4451-b3a9-f8243334c779-kube-api-access-57jp5\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.729462 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-log-httpd\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.729480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc 
kubenswrapper[4825]: I1007 19:19:53.729516 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-config-data\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.729563 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.729610 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-run-httpd\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.730152 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-log-httpd\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.730275 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-run-httpd\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.734116 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-scripts\") pod \"ceilometer-0\" (UID: 
\"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.735410 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.735501 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.742273 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.742446 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-config-data\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.748809 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57jp5\" (UniqueName: \"kubernetes.io/projected/40b6f4a6-be77-4451-b3a9-f8243334c779-kube-api-access-57jp5\") pod \"ceilometer-0\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " pod="openstack/ceilometer-0" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.806747 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4a6e447f-e2b4-45ec-999a-08b00d7fe14a" path="/var/lib/kubelet/pods/4a6e447f-e2b4-45ec-999a-08b00d7fe14a/volumes" Oct 07 19:19:53 crc kubenswrapper[4825]: I1007 19:19:53.844295 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.257294 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:19:54 crc kubenswrapper[4825]: W1007 19:19:54.262151 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40b6f4a6_be77_4451_b3a9_f8243334c779.slice/crio-e41177f96bc30fe871c24a908bd0cf4f17b39b08ca404d7f3da60a0c78f52637 WatchSource:0}: Error finding container e41177f96bc30fe871c24a908bd0cf4f17b39b08ca404d7f3da60a0c78f52637: Status 404 returned error can't find the container with id e41177f96bc30fe871c24a908bd0cf4f17b39b08ca404d7f3da60a0c78f52637 Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.445847 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerStarted","Data":"e41177f96bc30fe871c24a908bd0cf4f17b39b08ca404d7f3da60a0c78f52637"} Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.447883 4825 generic.go:334] "Generic (PLEG): container finished" podID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerID="4ba4f194e908f0920dc5ec7deffa30535c2f100a35d0955b8887349efca9d864" exitCode=0 Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.447930 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"20980f0a-4148-4983-8991-ea4563cfbc5a","Type":"ContainerDied","Data":"4ba4f194e908f0920dc5ec7deffa30535c2f100a35d0955b8887349efca9d864"} Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.602309 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757429 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-internal-tls-certs\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757684 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-combined-ca-bundle\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757703 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-config-data\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757757 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-logs\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757810 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-scripts\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757860 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlnlq\" (UniqueName: 
\"kubernetes.io/projected/20980f0a-4148-4983-8991-ea4563cfbc5a-kube-api-access-qlnlq\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757877 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.757944 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-httpd-run\") pod \"20980f0a-4148-4983-8991-ea4563cfbc5a\" (UID: \"20980f0a-4148-4983-8991-ea4563cfbc5a\") " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.758551 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-logs" (OuterVolumeSpecName: "logs") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.758660 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.764814 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-scripts" (OuterVolumeSpecName: "scripts") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.765679 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20980f0a-4148-4983-8991-ea4563cfbc5a-kube-api-access-qlnlq" (OuterVolumeSpecName: "kube-api-access-qlnlq") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "kube-api-access-qlnlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.765990 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.787426 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.807154 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-config-data" (OuterVolumeSpecName: "config-data") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.830861 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "20980f0a-4148-4983-8991-ea4563cfbc5a" (UID: "20980f0a-4148-4983-8991-ea4563cfbc5a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860481 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860519 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860534 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860548 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 
19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860559 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20980f0a-4148-4983-8991-ea4563cfbc5a-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860569 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20980f0a-4148-4983-8991-ea4563cfbc5a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860606 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlnlq\" (UniqueName: \"kubernetes.io/projected/20980f0a-4148-4983-8991-ea4563cfbc5a-kube-api-access-qlnlq\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.860649 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.887492 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 07 19:19:54 crc kubenswrapper[4825]: I1007 19:19:54.962432 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.461569 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerStarted","Data":"9820a85ac927bd697e1894c6845ba87cd82f90bce6c3b848c17122a5ace374b5"} Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.464543 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"20980f0a-4148-4983-8991-ea4563cfbc5a","Type":"ContainerDied","Data":"d07a43b2fd4e4ac4aa3f7ae6ea7ab2106945c03180672d8d8bd130188e5beed7"} Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.464576 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.464611 4825 scope.go:117] "RemoveContainer" containerID="4ba4f194e908f0920dc5ec7deffa30535c2f100a35d0955b8887349efca9d864" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.506684 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.528260 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.532331 4825 scope.go:117] "RemoveContainer" containerID="fbacb8c9a04b1fa2222ab53c98163452b1e70381c82894595819fc1118db2b62" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.542376 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:19:55 crc kubenswrapper[4825]: E1007 19:19:55.542855 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-httpd" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.542871 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-httpd" Oct 07 19:19:55 crc kubenswrapper[4825]: E1007 19:19:55.542913 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-log" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.542921 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-log" Oct 07 19:19:55 crc 
kubenswrapper[4825]: I1007 19:19:55.543134 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-httpd" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.543156 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" containerName="glance-log" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.544281 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.547935 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.548186 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.568053 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.674378 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/747f4079-112d-4889-937f-fc39c9d75819-logs\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.674418 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-config-data\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.674614 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.674668 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.674710 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/747f4079-112d-4889-937f-fc39c9d75819-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.674854 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-scripts\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.674954 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wjmg\" (UniqueName: \"kubernetes.io/projected/747f4079-112d-4889-937f-fc39c9d75819-kube-api-access-2wjmg\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.675005 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.776897 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wjmg\" (UniqueName: \"kubernetes.io/projected/747f4079-112d-4889-937f-fc39c9d75819-kube-api-access-2wjmg\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.776950 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.776985 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/747f4079-112d-4889-937f-fc39c9d75819-logs\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777005 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-config-data\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777065 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777085 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777102 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/747f4079-112d-4889-937f-fc39c9d75819-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-scripts\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777724 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/747f4079-112d-4889-937f-fc39c9d75819-logs\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777881 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.777959 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/747f4079-112d-4889-937f-fc39c9d75819-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.783891 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-config-data\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.784508 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-scripts\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.786422 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.787872 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747f4079-112d-4889-937f-fc39c9d75819-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.801600 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wjmg\" (UniqueName: \"kubernetes.io/projected/747f4079-112d-4889-937f-fc39c9d75819-kube-api-access-2wjmg\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.805461 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20980f0a-4148-4983-8991-ea4563cfbc5a" path="/var/lib/kubelet/pods/20980f0a-4148-4983-8991-ea4563cfbc5a/volumes" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.813961 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"747f4079-112d-4889-937f-fc39c9d75819\") " pod="openstack/glance-default-internal-api-0" Oct 07 19:19:55 crc kubenswrapper[4825]: I1007 19:19:55.878725 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 19:19:56 crc kubenswrapper[4825]: I1007 19:19:56.428549 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 19:19:56 crc kubenswrapper[4825]: W1007 19:19:56.430947 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod747f4079_112d_4889_937f_fc39c9d75819.slice/crio-2520164345e4fc73200c4b2534dd66fbc2d21ef80f0cccf4b6ecf0d5caab8323 WatchSource:0}: Error finding container 2520164345e4fc73200c4b2534dd66fbc2d21ef80f0cccf4b6ecf0d5caab8323: Status 404 returned error can't find the container with id 2520164345e4fc73200c4b2534dd66fbc2d21ef80f0cccf4b6ecf0d5caab8323 Oct 07 19:19:56 crc kubenswrapper[4825]: I1007 19:19:56.475260 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"747f4079-112d-4889-937f-fc39c9d75819","Type":"ContainerStarted","Data":"2520164345e4fc73200c4b2534dd66fbc2d21ef80f0cccf4b6ecf0d5caab8323"} Oct 07 19:19:56 crc kubenswrapper[4825]: I1007 19:19:56.477732 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerStarted","Data":"9bef7754332b0e4c5492d59e6d877e3468d43099a4c18207670272fa4ee97332"} Oct 07 19:19:56 crc kubenswrapper[4825]: I1007 19:19:56.477767 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerStarted","Data":"bb78dce91a684275a08ecbbbe64b95939eb22930bd1c9fbba4f6f0dab445d4ea"} Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.432272 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e266-account-create-crc5b"] Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.434165 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e266-account-create-crc5b" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.438551 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.441014 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e266-account-create-crc5b"] Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.512043 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"747f4079-112d-4889-937f-fc39c9d75819","Type":"ContainerStarted","Data":"c2ee5a5264318e08223015a52ef57e30e2bdbf2bd21343a04ee4234b19aed371"} Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.512099 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"747f4079-112d-4889-937f-fc39c9d75819","Type":"ContainerStarted","Data":"80c25f53dcafa990d5a467b0efb7076de4a9757870d1ccd07f358a8adafd46bf"} Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.512664 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztgqg\" (UniqueName: \"kubernetes.io/projected/d6310f91-ae90-4fd6-a4a5-51a43304420a-kube-api-access-ztgqg\") pod \"nova-api-e266-account-create-crc5b\" (UID: \"d6310f91-ae90-4fd6-a4a5-51a43304420a\") " pod="openstack/nova-api-e266-account-create-crc5b" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.533055 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.533038157 podStartE2EDuration="2.533038157s" podCreationTimestamp="2025-10-07 19:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:19:57.530671272 +0000 UTC m=+1186.352709919" watchObservedRunningTime="2025-10-07 
19:19:57.533038157 +0000 UTC m=+1186.355076794" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.627776 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f3e1-account-create-psn2x"] Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.627962 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztgqg\" (UniqueName: \"kubernetes.io/projected/d6310f91-ae90-4fd6-a4a5-51a43304420a-kube-api-access-ztgqg\") pod \"nova-api-e266-account-create-crc5b\" (UID: \"d6310f91-ae90-4fd6-a4a5-51a43304420a\") " pod="openstack/nova-api-e266-account-create-crc5b" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.629654 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f3e1-account-create-psn2x" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.640517 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.647511 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztgqg\" (UniqueName: \"kubernetes.io/projected/d6310f91-ae90-4fd6-a4a5-51a43304420a-kube-api-access-ztgqg\") pod \"nova-api-e266-account-create-crc5b\" (UID: \"d6310f91-ae90-4fd6-a4a5-51a43304420a\") " pod="openstack/nova-api-e266-account-create-crc5b" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.662252 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f3e1-account-create-psn2x"] Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.729751 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hgb\" (UniqueName: \"kubernetes.io/projected/676ed7ae-5a2c-4680-9138-81a1cf620b00-kube-api-access-x4hgb\") pod \"nova-cell0-f3e1-account-create-psn2x\" (UID: \"676ed7ae-5a2c-4680-9138-81a1cf620b00\") " 
pod="openstack/nova-cell0-f3e1-account-create-psn2x" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.757517 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e266-account-create-crc5b" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.826417 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d2e4-account-create-6vqxn"] Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.827937 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d2e4-account-create-6vqxn" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.830526 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.836666 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d2e4-account-create-6vqxn"] Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.845010 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hgb\" (UniqueName: \"kubernetes.io/projected/676ed7ae-5a2c-4680-9138-81a1cf620b00-kube-api-access-x4hgb\") pod \"nova-cell0-f3e1-account-create-psn2x\" (UID: \"676ed7ae-5a2c-4680-9138-81a1cf620b00\") " pod="openstack/nova-cell0-f3e1-account-create-psn2x" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.874796 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hgb\" (UniqueName: \"kubernetes.io/projected/676ed7ae-5a2c-4680-9138-81a1cf620b00-kube-api-access-x4hgb\") pod \"nova-cell0-f3e1-account-create-psn2x\" (UID: \"676ed7ae-5a2c-4680-9138-81a1cf620b00\") " pod="openstack/nova-cell0-f3e1-account-create-psn2x" Oct 07 19:19:57 crc kubenswrapper[4825]: I1007 19:19:57.947402 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwtb\" (UniqueName: 
\"kubernetes.io/projected/50aee8af-3586-4ace-9c5a-4046a5af52e1-kube-api-access-9jwtb\") pod \"nova-cell1-d2e4-account-create-6vqxn\" (UID: \"50aee8af-3586-4ace-9c5a-4046a5af52e1\") " pod="openstack/nova-cell1-d2e4-account-create-6vqxn" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.049577 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwtb\" (UniqueName: \"kubernetes.io/projected/50aee8af-3586-4ace-9c5a-4046a5af52e1-kube-api-access-9jwtb\") pod \"nova-cell1-d2e4-account-create-6vqxn\" (UID: \"50aee8af-3586-4ace-9c5a-4046a5af52e1\") " pod="openstack/nova-cell1-d2e4-account-create-6vqxn" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.051466 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f3e1-account-create-psn2x" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.069088 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwtb\" (UniqueName: \"kubernetes.io/projected/50aee8af-3586-4ace-9c5a-4046a5af52e1-kube-api-access-9jwtb\") pod \"nova-cell1-d2e4-account-create-6vqxn\" (UID: \"50aee8af-3586-4ace-9c5a-4046a5af52e1\") " pod="openstack/nova-cell1-d2e4-account-create-6vqxn" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.190004 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d2e4-account-create-6vqxn" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.248260 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e266-account-create-crc5b"] Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.493531 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f3e1-account-create-psn2x"] Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.512682 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.525481 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerStarted","Data":"c1fa8c6f2e474f40c8bbffc7890082ecdd97553137f94aa27aebe0054bbed8e1"} Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.525860 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.543000 4825 generic.go:334] "Generic (PLEG): container finished" podID="d6310f91-ae90-4fd6-a4a5-51a43304420a" containerID="53f77e7aa53f7dceda6d19d9e57c19451db12591126ef950e7405e4496863975" exitCode=0 Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.543883 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e266-account-create-crc5b" event={"ID":"d6310f91-ae90-4fd6-a4a5-51a43304420a","Type":"ContainerDied","Data":"53f77e7aa53f7dceda6d19d9e57c19451db12591126ef950e7405e4496863975"} Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.543922 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e266-account-create-crc5b" event={"ID":"d6310f91-ae90-4fd6-a4a5-51a43304420a","Type":"ContainerStarted","Data":"9fa5f52be136e0b131bcb6ba1e5d2cc9c3fc419477c2536af033610b397f29d0"} Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 
19:19:58.587392 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.315148858 podStartE2EDuration="5.587376057s" podCreationTimestamp="2025-10-07 19:19:53 +0000 UTC" firstStartedPulling="2025-10-07 19:19:54.264710301 +0000 UTC m=+1183.086748938" lastFinishedPulling="2025-10-07 19:19:57.5369375 +0000 UTC m=+1186.358976137" observedRunningTime="2025-10-07 19:19:58.568639992 +0000 UTC m=+1187.390678629" watchObservedRunningTime="2025-10-07 19:19:58.587376057 +0000 UTC m=+1187.409414694" Oct 07 19:19:58 crc kubenswrapper[4825]: I1007 19:19:58.645278 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d2e4-account-create-6vqxn"] Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.559918 4825 generic.go:334] "Generic (PLEG): container finished" podID="50aee8af-3586-4ace-9c5a-4046a5af52e1" containerID="48c4c70bacf7a002e12f9e617d56bd1e5679a77d86655fa4831c360b8b9dee78" exitCode=0 Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.560041 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d2e4-account-create-6vqxn" event={"ID":"50aee8af-3586-4ace-9c5a-4046a5af52e1","Type":"ContainerDied","Data":"48c4c70bacf7a002e12f9e617d56bd1e5679a77d86655fa4831c360b8b9dee78"} Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.560311 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d2e4-account-create-6vqxn" event={"ID":"50aee8af-3586-4ace-9c5a-4046a5af52e1","Type":"ContainerStarted","Data":"133b536bc1ccf27e4fc1f7c296c8374a4eb10a57df5a9487407dec00332606a9"} Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.562703 4825 generic.go:334] "Generic (PLEG): container finished" podID="676ed7ae-5a2c-4680-9138-81a1cf620b00" containerID="10854187a6a87534d43ea89325e40bfdddb3e63a16f27ce9866afcdbaed1237d" exitCode=0 Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.562844 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell0-f3e1-account-create-psn2x" event={"ID":"676ed7ae-5a2c-4680-9138-81a1cf620b00","Type":"ContainerDied","Data":"10854187a6a87534d43ea89325e40bfdddb3e63a16f27ce9866afcdbaed1237d"} Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.562914 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f3e1-account-create-psn2x" event={"ID":"676ed7ae-5a2c-4680-9138-81a1cf620b00","Type":"ContainerStarted","Data":"10f39a2c6b6b8218089229b12006b8c00aa0bf97940ea6e1c92990201a635de6"} Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.752691 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.753297 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.802156 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.824116 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 19:19:59 crc kubenswrapper[4825]: I1007 19:19:59.957465 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e266-account-create-crc5b" Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.084881 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztgqg\" (UniqueName: \"kubernetes.io/projected/d6310f91-ae90-4fd6-a4a5-51a43304420a-kube-api-access-ztgqg\") pod \"d6310f91-ae90-4fd6-a4a5-51a43304420a\" (UID: \"d6310f91-ae90-4fd6-a4a5-51a43304420a\") " Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.090042 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6310f91-ae90-4fd6-a4a5-51a43304420a-kube-api-access-ztgqg" (OuterVolumeSpecName: "kube-api-access-ztgqg") pod "d6310f91-ae90-4fd6-a4a5-51a43304420a" (UID: "d6310f91-ae90-4fd6-a4a5-51a43304420a"). InnerVolumeSpecName "kube-api-access-ztgqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.186883 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztgqg\" (UniqueName: \"kubernetes.io/projected/d6310f91-ae90-4fd6-a4a5-51a43304420a-kube-api-access-ztgqg\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.575716 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e266-account-create-crc5b" Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.575706 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e266-account-create-crc5b" event={"ID":"d6310f91-ae90-4fd6-a4a5-51a43304420a","Type":"ContainerDied","Data":"9fa5f52be136e0b131bcb6ba1e5d2cc9c3fc419477c2536af033610b397f29d0"} Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.578402 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa5f52be136e0b131bcb6ba1e5d2cc9c3fc419477c2536af033610b397f29d0" Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.578444 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 19:20:00 crc kubenswrapper[4825]: I1007 19:20:00.578474 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.063503 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d2e4-account-create-6vqxn" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.068710 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f3e1-account-create-psn2x" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.207438 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4hgb\" (UniqueName: \"kubernetes.io/projected/676ed7ae-5a2c-4680-9138-81a1cf620b00-kube-api-access-x4hgb\") pod \"676ed7ae-5a2c-4680-9138-81a1cf620b00\" (UID: \"676ed7ae-5a2c-4680-9138-81a1cf620b00\") " Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.207517 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jwtb\" (UniqueName: \"kubernetes.io/projected/50aee8af-3586-4ace-9c5a-4046a5af52e1-kube-api-access-9jwtb\") pod \"50aee8af-3586-4ace-9c5a-4046a5af52e1\" (UID: \"50aee8af-3586-4ace-9c5a-4046a5af52e1\") " Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.213280 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/676ed7ae-5a2c-4680-9138-81a1cf620b00-kube-api-access-x4hgb" (OuterVolumeSpecName: "kube-api-access-x4hgb") pod "676ed7ae-5a2c-4680-9138-81a1cf620b00" (UID: "676ed7ae-5a2c-4680-9138-81a1cf620b00"). InnerVolumeSpecName "kube-api-access-x4hgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.213358 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50aee8af-3586-4ace-9c5a-4046a5af52e1-kube-api-access-9jwtb" (OuterVolumeSpecName: "kube-api-access-9jwtb") pod "50aee8af-3586-4ace-9c5a-4046a5af52e1" (UID: "50aee8af-3586-4ace-9c5a-4046a5af52e1"). InnerVolumeSpecName "kube-api-access-9jwtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.309274 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4hgb\" (UniqueName: \"kubernetes.io/projected/676ed7ae-5a2c-4680-9138-81a1cf620b00-kube-api-access-x4hgb\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.309306 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jwtb\" (UniqueName: \"kubernetes.io/projected/50aee8af-3586-4ace-9c5a-4046a5af52e1-kube-api-access-9jwtb\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.613548 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d2e4-account-create-6vqxn" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.614439 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d2e4-account-create-6vqxn" event={"ID":"50aee8af-3586-4ace-9c5a-4046a5af52e1","Type":"ContainerDied","Data":"133b536bc1ccf27e4fc1f7c296c8374a4eb10a57df5a9487407dec00332606a9"} Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.614562 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133b536bc1ccf27e4fc1f7c296c8374a4eb10a57df5a9487407dec00332606a9" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.633763 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f3e1-account-create-psn2x" Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.634075 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f3e1-account-create-psn2x" event={"ID":"676ed7ae-5a2c-4680-9138-81a1cf620b00","Type":"ContainerDied","Data":"10f39a2c6b6b8218089229b12006b8c00aa0bf97940ea6e1c92990201a635de6"} Oct 07 19:20:01 crc kubenswrapper[4825]: I1007 19:20:01.634167 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10f39a2c6b6b8218089229b12006b8c00aa0bf97940ea6e1c92990201a635de6" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.528832 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.530414 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.884096 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cxfw9"] Oct 07 19:20:02 crc kubenswrapper[4825]: E1007 19:20:02.884529 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6310f91-ae90-4fd6-a4a5-51a43304420a" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.884544 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6310f91-ae90-4fd6-a4a5-51a43304420a" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: E1007 19:20:02.884577 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50aee8af-3586-4ace-9c5a-4046a5af52e1" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.884583 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="50aee8af-3586-4ace-9c5a-4046a5af52e1" containerName="mariadb-account-create" Oct 07 19:20:02 crc 
kubenswrapper[4825]: E1007 19:20:02.884593 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="676ed7ae-5a2c-4680-9138-81a1cf620b00" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.884600 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="676ed7ae-5a2c-4680-9138-81a1cf620b00" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.884805 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6310f91-ae90-4fd6-a4a5-51a43304420a" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.884832 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="50aee8af-3586-4ace-9c5a-4046a5af52e1" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.884844 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="676ed7ae-5a2c-4680-9138-81a1cf620b00" containerName="mariadb-account-create" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.885537 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.887358 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.887588 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q9twx" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.887777 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.897497 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cxfw9"] Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.937550 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.937645 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-scripts\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.937691 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-config-data\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " 
pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:02 crc kubenswrapper[4825]: I1007 19:20:02.937712 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hf2x\" (UniqueName: \"kubernetes.io/projected/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-kube-api-access-5hf2x\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.039610 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.039742 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-scripts\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.039801 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-config-data\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.039832 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hf2x\" (UniqueName: \"kubernetes.io/projected/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-kube-api-access-5hf2x\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: 
\"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.046997 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-scripts\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.047632 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-config-data\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.058123 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.069456 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hf2x\" (UniqueName: \"kubernetes.io/projected/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-kube-api-access-5hf2x\") pod \"nova-cell0-conductor-db-sync-cxfw9\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.205251 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:03 crc kubenswrapper[4825]: I1007 19:20:03.724336 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cxfw9"] Oct 07 19:20:04 crc kubenswrapper[4825]: I1007 19:20:04.662514 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" event={"ID":"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88","Type":"ContainerStarted","Data":"1690a517c647942e66fd227756e1977991a35368afc4d0f7c515610cf322ffd5"} Oct 07 19:20:05 crc kubenswrapper[4825]: I1007 19:20:05.708632 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:20:05 crc kubenswrapper[4825]: I1007 19:20:05.708703 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:20:05 crc kubenswrapper[4825]: I1007 19:20:05.879850 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:05 crc kubenswrapper[4825]: I1007 19:20:05.879910 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:05 crc kubenswrapper[4825]: I1007 19:20:05.909270 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:05 crc kubenswrapper[4825]: I1007 19:20:05.934997 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:06 crc kubenswrapper[4825]: I1007 19:20:06.691753 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:06 crc kubenswrapper[4825]: I1007 19:20:06.691872 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:07 crc kubenswrapper[4825]: I1007 19:20:07.523281 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:20:07 crc kubenswrapper[4825]: I1007 19:20:07.523616 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-central-agent" containerID="cri-o://9820a85ac927bd697e1894c6845ba87cd82f90bce6c3b848c17122a5ace374b5" gracePeriod=30 Oct 07 19:20:07 crc kubenswrapper[4825]: I1007 19:20:07.524342 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="proxy-httpd" containerID="cri-o://c1fa8c6f2e474f40c8bbffc7890082ecdd97553137f94aa27aebe0054bbed8e1" gracePeriod=30 Oct 07 19:20:07 crc kubenswrapper[4825]: I1007 19:20:07.524497 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="sg-core" containerID="cri-o://9bef7754332b0e4c5492d59e6d877e3468d43099a4c18207670272fa4ee97332" gracePeriod=30 Oct 07 19:20:07 crc kubenswrapper[4825]: I1007 19:20:07.524564 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-notification-agent" containerID="cri-o://bb78dce91a684275a08ecbbbe64b95939eb22930bd1c9fbba4f6f0dab445d4ea" gracePeriod=30 Oct 07 19:20:07 crc 
kubenswrapper[4825]: I1007 19:20:07.543036 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.179:3000/\": EOF" Oct 07 19:20:07 crc kubenswrapper[4825]: I1007 19:20:07.703468 4825 generic.go:334] "Generic (PLEG): container finished" podID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerID="9bef7754332b0e4c5492d59e6d877e3468d43099a4c18207670272fa4ee97332" exitCode=2 Oct 07 19:20:07 crc kubenswrapper[4825]: I1007 19:20:07.703515 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerDied","Data":"9bef7754332b0e4c5492d59e6d877e3468d43099a4c18207670272fa4ee97332"} Oct 07 19:20:08 crc kubenswrapper[4825]: I1007 19:20:08.600120 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:08 crc kubenswrapper[4825]: I1007 19:20:08.640023 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 19:20:08 crc kubenswrapper[4825]: I1007 19:20:08.748217 4825 generic.go:334] "Generic (PLEG): container finished" podID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerID="c1fa8c6f2e474f40c8bbffc7890082ecdd97553137f94aa27aebe0054bbed8e1" exitCode=0 Oct 07 19:20:08 crc kubenswrapper[4825]: I1007 19:20:08.748294 4825 generic.go:334] "Generic (PLEG): container finished" podID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerID="bb78dce91a684275a08ecbbbe64b95939eb22930bd1c9fbba4f6f0dab445d4ea" exitCode=0 Oct 07 19:20:08 crc kubenswrapper[4825]: I1007 19:20:08.748304 4825 generic.go:334] "Generic (PLEG): container finished" podID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerID="9820a85ac927bd697e1894c6845ba87cd82f90bce6c3b848c17122a5ace374b5" exitCode=0 Oct 07 19:20:08 crc 
kubenswrapper[4825]: I1007 19:20:08.749837 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerDied","Data":"c1fa8c6f2e474f40c8bbffc7890082ecdd97553137f94aa27aebe0054bbed8e1"} Oct 07 19:20:08 crc kubenswrapper[4825]: I1007 19:20:08.749889 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerDied","Data":"bb78dce91a684275a08ecbbbe64b95939eb22930bd1c9fbba4f6f0dab445d4ea"} Oct 07 19:20:08 crc kubenswrapper[4825]: I1007 19:20:08.749906 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerDied","Data":"9820a85ac927bd697e1894c6845ba87cd82f90bce6c3b848c17122a5ace374b5"} Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.328518 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455576 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-scripts\") pod \"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455716 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-run-httpd\") pod \"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455746 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-log-httpd\") pod 
\"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455804 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-sg-core-conf-yaml\") pod \"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455835 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-combined-ca-bundle\") pod \"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455865 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57jp5\" (UniqueName: \"kubernetes.io/projected/40b6f4a6-be77-4451-b3a9-f8243334c779-kube-api-access-57jp5\") pod \"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455914 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-ceilometer-tls-certs\") pod \"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.455934 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-config-data\") pod \"40b6f4a6-be77-4451-b3a9-f8243334c779\" (UID: \"40b6f4a6-be77-4451-b3a9-f8243334c779\") " Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.456813 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.460395 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-scripts" (OuterVolumeSpecName: "scripts") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.461200 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.465833 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b6f4a6-be77-4451-b3a9-f8243334c779-kube-api-access-57jp5" (OuterVolumeSpecName: "kube-api-access-57jp5") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). InnerVolumeSpecName "kube-api-access-57jp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.482972 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.519443 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.537601 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.558042 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.558071 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.558080 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.558088 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/40b6f4a6-be77-4451-b3a9-f8243334c779-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.558096 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.558104 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.558112 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57jp5\" (UniqueName: \"kubernetes.io/projected/40b6f4a6-be77-4451-b3a9-f8243334c779-kube-api-access-57jp5\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.570830 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-config-data" (OuterVolumeSpecName: "config-data") pod "40b6f4a6-be77-4451-b3a9-f8243334c779" (UID: "40b6f4a6-be77-4451-b3a9-f8243334c779"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.659369 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6f4a6-be77-4451-b3a9-f8243334c779-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.842884 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" event={"ID":"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88","Type":"ContainerStarted","Data":"e02641bc6aef60408675db5ace8a39d6fb928a411b49e0d00102258fa8b5d35a"} Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.848901 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40b6f4a6-be77-4451-b3a9-f8243334c779","Type":"ContainerDied","Data":"e41177f96bc30fe871c24a908bd0cf4f17b39b08ca404d7f3da60a0c78f52637"} Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.848957 4825 scope.go:117] "RemoveContainer" containerID="c1fa8c6f2e474f40c8bbffc7890082ecdd97553137f94aa27aebe0054bbed8e1" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.849065 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.872621 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" podStartSLOduration=2.517163857 podStartE2EDuration="11.87260429s" podCreationTimestamp="2025-10-07 19:20:02 +0000 UTC" firstStartedPulling="2025-10-07 19:20:03.749834941 +0000 UTC m=+1192.571873588" lastFinishedPulling="2025-10-07 19:20:13.105275384 +0000 UTC m=+1201.927314021" observedRunningTime="2025-10-07 19:20:13.870878856 +0000 UTC m=+1202.692917493" watchObservedRunningTime="2025-10-07 19:20:13.87260429 +0000 UTC m=+1202.694642927" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.885356 4825 scope.go:117] "RemoveContainer" containerID="9bef7754332b0e4c5492d59e6d877e3468d43099a4c18207670272fa4ee97332" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.930765 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.962878 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975289 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:20:13 crc kubenswrapper[4825]: E1007 19:20:13.975676 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-notification-agent" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975688 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-notification-agent" Oct 07 19:20:13 crc kubenswrapper[4825]: E1007 19:20:13.975705 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="proxy-httpd" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975711 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="proxy-httpd" Oct 07 19:20:13 crc kubenswrapper[4825]: E1007 19:20:13.975728 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="sg-core" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975734 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="sg-core" Oct 07 19:20:13 crc kubenswrapper[4825]: E1007 19:20:13.975745 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-central-agent" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975751 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-central-agent" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975950 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-central-agent" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975965 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="proxy-httpd" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975976 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="sg-core" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.975990 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" containerName="ceilometer-notification-agent" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.977880 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.980168 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.982170 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.982424 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.982648 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 19:20:13 crc kubenswrapper[4825]: I1007 19:20:13.986365 4825 scope.go:117] "RemoveContainer" containerID="bb78dce91a684275a08ecbbbe64b95939eb22930bd1c9fbba4f6f0dab445d4ea" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.011221 4825 scope.go:117] "RemoveContainer" containerID="9820a85ac927bd697e1894c6845ba87cd82f90bce6c3b848c17122a5ace374b5" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067467 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067507 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dztw5\" (UniqueName: \"kubernetes.io/projected/1e3f5992-6777-4a65-b0f2-2510b363b9bf-kube-api-access-dztw5\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067545 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-run-httpd\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067633 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-config-data\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067664 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067700 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-scripts\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067717 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-log-httpd\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.067817 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170356 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170461 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170490 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dztw5\" (UniqueName: \"kubernetes.io/projected/1e3f5992-6777-4a65-b0f2-2510b363b9bf-kube-api-access-dztw5\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170536 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-run-httpd\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170591 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-config-data\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170621 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170660 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-scripts\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.170681 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-log-httpd\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.171128 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-log-httpd\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.171558 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-run-httpd\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.175551 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 
19:20:14.176291 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.184019 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-scripts\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.184737 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.186145 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-config-data\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.190971 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dztw5\" (UniqueName: \"kubernetes.io/projected/1e3f5992-6777-4a65-b0f2-2510b363b9bf-kube-api-access-dztw5\") pod \"ceilometer-0\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.299154 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.829785 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:20:14 crc kubenswrapper[4825]: I1007 19:20:14.868261 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerStarted","Data":"01ca31fb3a0dfcda7f5eb72fd774fbdc52dd6f1c6da4e8269bacff612a37f3cd"} Oct 07 19:20:15 crc kubenswrapper[4825]: I1007 19:20:15.808126 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b6f4a6-be77-4451-b3a9-f8243334c779" path="/var/lib/kubelet/pods/40b6f4a6-be77-4451-b3a9-f8243334c779/volumes" Oct 07 19:20:16 crc kubenswrapper[4825]: I1007 19:20:16.893329 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerStarted","Data":"be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f"} Oct 07 19:20:17 crc kubenswrapper[4825]: I1007 19:20:17.904697 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerStarted","Data":"dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8"} Oct 07 19:20:17 crc kubenswrapper[4825]: I1007 19:20:17.904992 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerStarted","Data":"df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530"} Oct 07 19:20:19 crc kubenswrapper[4825]: I1007 19:20:19.926006 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerStarted","Data":"8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102"} Oct 07 19:20:19 crc kubenswrapper[4825]: I1007 
19:20:19.926640 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 19:20:23 crc kubenswrapper[4825]: I1007 19:20:23.991882 4825 generic.go:334] "Generic (PLEG): container finished" podID="b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" containerID="e02641bc6aef60408675db5ace8a39d6fb928a411b49e0d00102258fa8b5d35a" exitCode=0 Oct 07 19:20:23 crc kubenswrapper[4825]: I1007 19:20:23.991953 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" event={"ID":"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88","Type":"ContainerDied","Data":"e02641bc6aef60408675db5ace8a39d6fb928a411b49e0d00102258fa8b5d35a"} Oct 07 19:20:24 crc kubenswrapper[4825]: I1007 19:20:24.021419 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.886419181 podStartE2EDuration="11.021394726s" podCreationTimestamp="2025-10-07 19:20:13 +0000 UTC" firstStartedPulling="2025-10-07 19:20:14.84656782 +0000 UTC m=+1203.668606457" lastFinishedPulling="2025-10-07 19:20:18.981543365 +0000 UTC m=+1207.803582002" observedRunningTime="2025-10-07 19:20:19.960601486 +0000 UTC m=+1208.782640143" watchObservedRunningTime="2025-10-07 19:20:24.021394726 +0000 UTC m=+1212.843433373" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.417637 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.506036 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-scripts\") pod \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.506104 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hf2x\" (UniqueName: \"kubernetes.io/projected/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-kube-api-access-5hf2x\") pod \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.506295 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-combined-ca-bundle\") pod \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.506355 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-config-data\") pod \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\" (UID: \"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88\") " Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.512316 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-scripts" (OuterVolumeSpecName: "scripts") pod "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" (UID: "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.512439 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-kube-api-access-5hf2x" (OuterVolumeSpecName: "kube-api-access-5hf2x") pod "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" (UID: "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88"). InnerVolumeSpecName "kube-api-access-5hf2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.542325 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" (UID: "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.562509 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-config-data" (OuterVolumeSpecName: "config-data") pod "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" (UID: "b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.609071 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.609114 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.609127 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:25 crc kubenswrapper[4825]: I1007 19:20:25.609138 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hf2x\" (UniqueName: \"kubernetes.io/projected/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88-kube-api-access-5hf2x\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.016279 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" event={"ID":"b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88","Type":"ContainerDied","Data":"1690a517c647942e66fd227756e1977991a35368afc4d0f7c515610cf322ffd5"} Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.016318 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1690a517c647942e66fd227756e1977991a35368afc4d0f7c515610cf322ffd5" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.016350 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cxfw9" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.115361 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 19:20:26 crc kubenswrapper[4825]: E1007 19:20:26.115819 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" containerName="nova-cell0-conductor-db-sync" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.115839 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" containerName="nova-cell0-conductor-db-sync" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.116083 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" containerName="nova-cell0-conductor-db-sync" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.116783 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.119132 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q9twx" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.121121 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.127244 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.218574 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j969n\" (UniqueName: \"kubernetes.io/projected/395c3018-72fe-4e48-a92d-e98026e550a3-kube-api-access-j969n\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc 
kubenswrapper[4825]: I1007 19:20:26.218793 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395c3018-72fe-4e48-a92d-e98026e550a3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.218898 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395c3018-72fe-4e48-a92d-e98026e550a3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.321032 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j969n\" (UniqueName: \"kubernetes.io/projected/395c3018-72fe-4e48-a92d-e98026e550a3-kube-api-access-j969n\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.321389 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395c3018-72fe-4e48-a92d-e98026e550a3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.321428 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395c3018-72fe-4e48-a92d-e98026e550a3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.327103 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395c3018-72fe-4e48-a92d-e98026e550a3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.328429 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395c3018-72fe-4e48-a92d-e98026e550a3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.351284 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j969n\" (UniqueName: \"kubernetes.io/projected/395c3018-72fe-4e48-a92d-e98026e550a3-kube-api-access-j969n\") pod \"nova-cell0-conductor-0\" (UID: \"395c3018-72fe-4e48-a92d-e98026e550a3\") " pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.436159 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:26 crc kubenswrapper[4825]: I1007 19:20:26.914458 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 19:20:26 crc kubenswrapper[4825]: W1007 19:20:26.927746 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod395c3018_72fe_4e48_a92d_e98026e550a3.slice/crio-e6e51688ab0eed2da8ecb6c8a99ceae321928aad4b76a40d0417f704575597c4 WatchSource:0}: Error finding container e6e51688ab0eed2da8ecb6c8a99ceae321928aad4b76a40d0417f704575597c4: Status 404 returned error can't find the container with id e6e51688ab0eed2da8ecb6c8a99ceae321928aad4b76a40d0417f704575597c4 Oct 07 19:20:27 crc kubenswrapper[4825]: I1007 19:20:27.027829 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"395c3018-72fe-4e48-a92d-e98026e550a3","Type":"ContainerStarted","Data":"e6e51688ab0eed2da8ecb6c8a99ceae321928aad4b76a40d0417f704575597c4"} Oct 07 19:20:28 crc kubenswrapper[4825]: I1007 19:20:28.061020 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"395c3018-72fe-4e48-a92d-e98026e550a3","Type":"ContainerStarted","Data":"fe9490d7a873b334a64e856bb59246e1b15dfc4dc50e7690277faeb40232ecf4"} Oct 07 19:20:28 crc kubenswrapper[4825]: I1007 19:20:28.061254 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:28 crc kubenswrapper[4825]: I1007 19:20:28.097316 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.097289846 podStartE2EDuration="2.097289846s" podCreationTimestamp="2025-10-07 19:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 
19:20:28.080380049 +0000 UTC m=+1216.902418686" watchObservedRunningTime="2025-10-07 19:20:28.097289846 +0000 UTC m=+1216.919328513" Oct 07 19:20:35 crc kubenswrapper[4825]: I1007 19:20:35.708638 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:20:35 crc kubenswrapper[4825]: I1007 19:20:35.709343 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:20:35 crc kubenswrapper[4825]: I1007 19:20:35.709412 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:20:35 crc kubenswrapper[4825]: I1007 19:20:35.710330 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a129d547f9c2f005540980fa89f701d13b633e45c1d0e5a234b2420081b437f"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:20:35 crc kubenswrapper[4825]: I1007 19:20:35.710437 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://7a129d547f9c2f005540980fa89f701d13b633e45c1d0e5a234b2420081b437f" gracePeriod=600 Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.151086 4825 generic.go:334] 
"Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="7a129d547f9c2f005540980fa89f701d13b633e45c1d0e5a234b2420081b437f" exitCode=0 Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.151135 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"7a129d547f9c2f005540980fa89f701d13b633e45c1d0e5a234b2420081b437f"} Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.151168 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"bb100741ed4991dde5bb64dc9ab561e9fa009739dcc0f0f2c8261720803021e4"} Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.151187 4825 scope.go:117] "RemoveContainer" containerID="906a228c17f7770f9388dbe04c2d4927f00d2c55a2ece21a3cd466abc03da78e" Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.484281 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.971953 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4nvlv"] Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.974001 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.976861 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.977091 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 19:20:36 crc kubenswrapper[4825]: I1007 19:20:36.983962 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4nvlv"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.026297 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-config-data\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.026386 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.026422 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-scripts\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.026503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99g9b\" (UniqueName: 
\"kubernetes.io/projected/e83137b5-f576-4a66-967b-ccdef3af6897-kube-api-access-99g9b\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.127939 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99g9b\" (UniqueName: \"kubernetes.io/projected/e83137b5-f576-4a66-967b-ccdef3af6897-kube-api-access-99g9b\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.128381 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.128462 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-config-data\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.128538 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.128573 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-scripts\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.129753 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.135912 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-config-data\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.136629 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.136901 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-scripts\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.137163 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.160648 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.180926 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99g9b\" (UniqueName: \"kubernetes.io/projected/e83137b5-f576-4a66-967b-ccdef3af6897-kube-api-access-99g9b\") pod \"nova-cell0-cell-mapping-4nvlv\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.220808 
4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.222396 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.224931 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.229643 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptxzf\" (UniqueName: \"kubernetes.io/projected/e3270262-7d96-4271-8be1-c01393ae7bc4-kube-api-access-ptxzf\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.229699 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3270262-7d96-4271-8be1-c01393ae7bc4-logs\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.229895 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-config-data\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.230016 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.248815 4825 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.311353 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332147 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332344 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-config-data\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332454 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hkl\" (UniqueName: \"kubernetes.io/projected/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-kube-api-access-29hkl\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332593 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptxzf\" (UniqueName: \"kubernetes.io/projected/e3270262-7d96-4271-8be1-c01393ae7bc4-kube-api-access-ptxzf\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332645 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332676 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3270262-7d96-4271-8be1-c01393ae7bc4-logs\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.332706 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-config-data\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.334377 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3270262-7d96-4271-8be1-c01393ae7bc4-logs\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.337831 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-config-data\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.340830 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " 
pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.342723 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.347960 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.383774 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptxzf\" (UniqueName: \"kubernetes.io/projected/e3270262-7d96-4271-8be1-c01393ae7bc4-kube-api-access-ptxzf\") pod \"nova-api-0\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " pod="openstack/nova-api-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.383836 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.436283 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.436342 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a9241a-965c-4942-9628-d09795a60b56-logs\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.436369 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-config-data\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc 
kubenswrapper[4825]: I1007 19:20:37.436459 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-config-data\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.436492 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.436521 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29hkl\" (UniqueName: \"kubernetes.io/projected/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-kube-api-access-29hkl\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.436618 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jbg\" (UniqueName: \"kubernetes.io/projected/66a9241a-965c-4942-9628-d09795a60b56-kube-api-access-x9jbg\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.440087 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.442720 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-config-data\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.444388 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.445811 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.448352 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.474192 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hkl\" (UniqueName: \"kubernetes.io/projected/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-kube-api-access-29hkl\") pod \"nova-scheduler-0\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.479411 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.541556 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z2rbb"] Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.568702 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jbg\" (UniqueName: \"kubernetes.io/projected/66a9241a-965c-4942-9628-d09795a60b56-kube-api-access-x9jbg\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0" Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.568753 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.568885 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a9241a-965c-4942-9628-d09795a60b56-logs\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.569076 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-config-data\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.569121 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.569199 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.569271 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tctd\" (UniqueName: \"kubernetes.io/projected/bc70b027-cab7-490d-a529-e74243d17ede-kube-api-access-9tctd\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.570189 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a9241a-965c-4942-9628-d09795a60b56-logs\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.577957 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.585068 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.593790 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jbg\" (UniqueName: \"kubernetes.io/projected/66a9241a-965c-4942-9628-d09795a60b56-kube-api-access-x9jbg\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.602052 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z2rbb"]
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.602151 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.603126 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.609753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-config-data\") pod \"nova-metadata-0\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") " pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682713 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682766 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682787 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzshn\" (UniqueName: \"kubernetes.io/projected/4c75272c-4299-451d-8fcf-82204dc97b14-kube-api-access-tzshn\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682864 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682893 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682910 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-config\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682950 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682967 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.682992 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tctd\" (UniqueName: \"kubernetes.io/projected/bc70b027-cab7-490d-a529-e74243d17ede-kube-api-access-9tctd\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.698907 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.705891 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.744754 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tctd\" (UniqueName: \"kubernetes.io/projected/bc70b027-cab7-490d-a529-e74243d17ede-kube-api-access-9tctd\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.784607 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.784862 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.784890 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzshn\" (UniqueName: \"kubernetes.io/projected/4c75272c-4299-451d-8fcf-82204dc97b14-kube-api-access-tzshn\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.784969 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.785004 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.785024 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-config\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.785849 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.785887 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-config\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.785949 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.785969 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.786425 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.815045 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzshn\" (UniqueName: \"kubernetes.io/projected/4c75272c-4299-451d-8fcf-82204dc97b14-kube-api-access-tzshn\") pod \"dnsmasq-dns-845d6d6f59-z2rbb\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.841535 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 19:20:37 crc kubenswrapper[4825]: I1007 19:20:37.878220 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.009012 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.037094 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4nvlv"]
Oct 07 19:20:38 crc kubenswrapper[4825]: W1007 19:20:38.047490 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode83137b5_f576_4a66_967b_ccdef3af6897.slice/crio-d5d046fa084a14bded94246a90eaa08cc1953e53e4fd62ca1f902642e8749c56 WatchSource:0}: Error finding container d5d046fa084a14bded94246a90eaa08cc1953e53e4fd62ca1f902642e8749c56: Status 404 returned error can't find the container with id d5d046fa084a14bded94246a90eaa08cc1953e53e4fd62ca1f902642e8749c56
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.178934 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4nvlv" event={"ID":"e83137b5-f576-4a66-967b-ccdef3af6897","Type":"ContainerStarted","Data":"d5d046fa084a14bded94246a90eaa08cc1953e53e4fd62ca1f902642e8749c56"}
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.232322 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 07 19:20:38 crc kubenswrapper[4825]: W1007 19:20:38.243275 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda92ef064_089b_4c42_b0b4_dcd9c54d75a3.slice/crio-0aaba32f719bfacfe0a09efa5241286a96bba9858d1ff91ece2cd2a464766c34 WatchSource:0}: Error finding container 0aaba32f719bfacfe0a09efa5241286a96bba9858d1ff91ece2cd2a464766c34: Status 404 returned error can't find the container with id 0aaba32f719bfacfe0a09efa5241286a96bba9858d1ff91ece2cd2a464766c34
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.304751 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ntw92"]
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.307192 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.312653 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.315186 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.348052 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ntw92"]
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.363176 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.398915 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rlk2\" (UniqueName: \"kubernetes.io/projected/67d7cb41-3e2e-4583-8552-665f52d70bc7-kube-api-access-8rlk2\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.398972 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-config-data\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.398990 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-scripts\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.399127 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: W1007 19:20:38.404905 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c75272c_4299_451d_8fcf_82204dc97b14.slice/crio-323a2ce2aad512d87cfec08db506ce66b25f5384086d09dbda13288f91bba973 WatchSource:0}: Error finding container 323a2ce2aad512d87cfec08db506ce66b25f5384086d09dbda13288f91bba973: Status 404 returned error can't find the container with id 323a2ce2aad512d87cfec08db506ce66b25f5384086d09dbda13288f91bba973
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.429983 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z2rbb"]
Oct 07 19:20:38 crc kubenswrapper[4825]: W1007 19:20:38.430403 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc70b027_cab7_490d_a529_e74243d17ede.slice/crio-cad22c313c51d3a2ebfc0ee6e42dc44321795c31947ca91e1680be8da379644b WatchSource:0}: Error finding container cad22c313c51d3a2ebfc0ee6e42dc44321795c31947ca91e1680be8da379644b: Status 404 returned error can't find the container with id cad22c313c51d3a2ebfc0ee6e42dc44321795c31947ca91e1680be8da379644b
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.442063 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.448483 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 19:20:38 crc kubenswrapper[4825]: W1007 19:20:38.451143 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66a9241a_965c_4942_9628_d09795a60b56.slice/crio-3c1c8101fc70797f3dca8098774354876aa8db82b964cb6ac2d5bf857f1cd311 WatchSource:0}: Error finding container 3c1c8101fc70797f3dca8098774354876aa8db82b964cb6ac2d5bf857f1cd311: Status 404 returned error can't find the container with id 3c1c8101fc70797f3dca8098774354876aa8db82b964cb6ac2d5bf857f1cd311
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.502362 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rlk2\" (UniqueName: \"kubernetes.io/projected/67d7cb41-3e2e-4583-8552-665f52d70bc7-kube-api-access-8rlk2\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.502423 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-config-data\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.502441 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-scripts\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.502469 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.506616 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-scripts\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.506728 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.509869 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-config-data\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.522116 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rlk2\" (UniqueName: \"kubernetes.io/projected/67d7cb41-3e2e-4583-8552-665f52d70bc7-kube-api-access-8rlk2\") pod \"nova-cell1-conductor-db-sync-ntw92\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:38 crc kubenswrapper[4825]: I1007 19:20:38.743824 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ntw92"
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.204703 4825 generic.go:334] "Generic (PLEG): container finished" podID="4c75272c-4299-451d-8fcf-82204dc97b14" containerID="322f2c205fb06cfdc446834b51771e7e1be2cf388fa434fce126d96b3348cd7e" exitCode=0
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.204799 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" event={"ID":"4c75272c-4299-451d-8fcf-82204dc97b14","Type":"ContainerDied","Data":"322f2c205fb06cfdc446834b51771e7e1be2cf388fa434fce126d96b3348cd7e"}
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.204852 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" event={"ID":"4c75272c-4299-451d-8fcf-82204dc97b14","Type":"ContainerStarted","Data":"323a2ce2aad512d87cfec08db506ce66b25f5384086d09dbda13288f91bba973"}
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.207189 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4nvlv" event={"ID":"e83137b5-f576-4a66-967b-ccdef3af6897","Type":"ContainerStarted","Data":"4e71f56cbdda4edd6fc279091f95eda084e94d2938c7086ddecacab8a2937970"}
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.207446 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ntw92"]
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.235973 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3270262-7d96-4271-8be1-c01393ae7bc4","Type":"ContainerStarted","Data":"608d889bf635c388ae60cd35da8633b56e012804638e08a91c8791bcea6b96db"}
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.242505 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc70b027-cab7-490d-a529-e74243d17ede","Type":"ContainerStarted","Data":"cad22c313c51d3a2ebfc0ee6e42dc44321795c31947ca91e1680be8da379644b"}
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.245385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a92ef064-089b-4c42-b0b4-dcd9c54d75a3","Type":"ContainerStarted","Data":"0aaba32f719bfacfe0a09efa5241286a96bba9858d1ff91ece2cd2a464766c34"}
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.264631 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66a9241a-965c-4942-9628-d09795a60b56","Type":"ContainerStarted","Data":"3c1c8101fc70797f3dca8098774354876aa8db82b964cb6ac2d5bf857f1cd311"}
Oct 07 19:20:39 crc kubenswrapper[4825]: I1007 19:20:39.271858 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4nvlv" podStartSLOduration=3.2718440060000002 podStartE2EDuration="3.271844006s" podCreationTimestamp="2025-10-07 19:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:39.242731731 +0000 UTC m=+1228.064770388" watchObservedRunningTime="2025-10-07 19:20:39.271844006 +0000 UTC m=+1228.093882643"
Oct 07 19:20:40 crc kubenswrapper[4825]: I1007 19:20:40.275421 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ntw92" event={"ID":"67d7cb41-3e2e-4583-8552-665f52d70bc7","Type":"ContainerStarted","Data":"43deadf38680adb256c4ca4664d6b3186ab3de0e6a2559a1fdad8d7494a8ebdd"}
Oct 07 19:20:40 crc kubenswrapper[4825]: I1007 19:20:40.275736 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ntw92" event={"ID":"67d7cb41-3e2e-4583-8552-665f52d70bc7","Type":"ContainerStarted","Data":"4f62b6555432aee077b3ca596b19e211efd191968ceaa239e97788081926a09f"}
Oct 07 19:20:40 crc kubenswrapper[4825]: I1007 19:20:40.281379 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" event={"ID":"4c75272c-4299-451d-8fcf-82204dc97b14","Type":"ContainerStarted","Data":"42711cfc8a81e917fed2e13fd9024ac1f313c82d8d697309eab32f85869245da"}
Oct 07 19:20:40 crc kubenswrapper[4825]: I1007 19:20:40.281423 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb"
Oct 07 19:20:40 crc kubenswrapper[4825]: I1007 19:20:40.323523 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" podStartSLOduration=3.323498781 podStartE2EDuration="3.323498781s" podCreationTimestamp="2025-10-07 19:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:40.317751228 +0000 UTC m=+1229.139789865" watchObservedRunningTime="2025-10-07 19:20:40.323498781 +0000 UTC m=+1229.145537418"
Oct 07 19:20:40 crc kubenswrapper[4825]: I1007 19:20:40.326500 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ntw92" podStartSLOduration=2.326475626 podStartE2EDuration="2.326475626s" podCreationTimestamp="2025-10-07 19:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:40.296748402 +0000 UTC m=+1229.118787059" watchObservedRunningTime="2025-10-07 19:20:40.326475626 +0000 UTC m=+1229.148514283"
Oct 07 19:20:41 crc kubenswrapper[4825]: I1007 19:20:41.501911 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 07 19:20:41 crc kubenswrapper[4825]: I1007 19:20:41.512800 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.304549 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a92ef064-089b-4c42-b0b4-dcd9c54d75a3","Type":"ContainerStarted","Data":"966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7"}
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.307148 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66a9241a-965c-4942-9628-d09795a60b56","Type":"ContainerStarted","Data":"11c8b8c7c8732710962d56818fd5650636ac9072a20f2e9660e87c042bba016b"}
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.307176 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66a9241a-965c-4942-9628-d09795a60b56","Type":"ContainerStarted","Data":"5f9b935151b2040f55fb5637402949c132fd30b243dbe2310e8de29f5f51b1b9"}
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.307268 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-log" containerID="cri-o://5f9b935151b2040f55fb5637402949c132fd30b243dbe2310e8de29f5f51b1b9" gracePeriod=30
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.307321 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-metadata" containerID="cri-o://11c8b8c7c8732710962d56818fd5650636ac9072a20f2e9660e87c042bba016b" gracePeriod=30
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.310547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3270262-7d96-4271-8be1-c01393ae7bc4","Type":"ContainerStarted","Data":"83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d"}
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.310579 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3270262-7d96-4271-8be1-c01393ae7bc4","Type":"ContainerStarted","Data":"cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513"}
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.313381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc70b027-cab7-490d-a529-e74243d17ede","Type":"ContainerStarted","Data":"18aecafa05947d211ade57c22b90f108df42f5bd60805ded5381dbd3ecc06747"}
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.313526 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="bc70b027-cab7-490d-a529-e74243d17ede" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://18aecafa05947d211ade57c22b90f108df42f5bd60805ded5381dbd3ecc06747" gracePeriod=30
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.335833 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.415959101 podStartE2EDuration="5.335811482s" podCreationTimestamp="2025-10-07 19:20:37 +0000 UTC" firstStartedPulling="2025-10-07 19:20:38.259418716 +0000 UTC m=+1227.081457343" lastFinishedPulling="2025-10-07 19:20:41.179271087 +0000 UTC m=+1230.001309724" observedRunningTime="2025-10-07 19:20:42.330265916 +0000 UTC m=+1231.152304553" watchObservedRunningTime="2025-10-07 19:20:42.335811482 +0000 UTC m=+1231.157850119"
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.364793 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.51759666 podStartE2EDuration="5.364779112s" podCreationTimestamp="2025-10-07 19:20:37 +0000 UTC" firstStartedPulling="2025-10-07 19:20:38.329332087 +0000 UTC m=+1227.151370724" lastFinishedPulling="2025-10-07 19:20:41.176514539 +0000 UTC m=+1229.998553176" observedRunningTime="2025-10-07 19:20:42.361693574 +0000 UTC m=+1231.183732211" watchObservedRunningTime="2025-10-07 19:20:42.364779112 +0000 UTC m=+1231.186817749"
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.380881 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.633430708 podStartE2EDuration="5.380869443s" podCreationTimestamp="2025-10-07 19:20:37 +0000 UTC" firstStartedPulling="2025-10-07 19:20:38.433403621 +0000 UTC m=+1227.255442258" lastFinishedPulling="2025-10-07 19:20:41.180842366 +0000 UTC m=+1230.002880993" observedRunningTime="2025-10-07 19:20:42.379048385 +0000 UTC m=+1231.201087022" watchObservedRunningTime="2025-10-07 19:20:42.380869443 +0000 UTC m=+1231.202908080"
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.413070 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.682200486 podStartE2EDuration="5.413046145s" podCreationTimestamp="2025-10-07 19:20:37 +0000 UTC" firstStartedPulling="2025-10-07 19:20:38.452814337 +0000 UTC m=+1227.274852974" lastFinishedPulling="2025-10-07 19:20:41.183659996 +0000 UTC m=+1230.005698633" observedRunningTime="2025-10-07 19:20:42.403009986 +0000 UTC m=+1231.225048613" watchObservedRunningTime="2025-10-07 19:20:42.413046145 +0000 UTC m=+1231.235084812"
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.603538 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.842156 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.842260 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 07 19:20:42 crc kubenswrapper[4825]: I1007 19:20:42.879378 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.323048 4825 generic.go:334] "Generic (PLEG): container finished" podID="66a9241a-965c-4942-9628-d09795a60b56" containerID="11c8b8c7c8732710962d56818fd5650636ac9072a20f2e9660e87c042bba016b" exitCode=0
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.323376 4825 generic.go:334] "Generic (PLEG): container finished" podID="66a9241a-965c-4942-9628-d09795a60b56" containerID="5f9b935151b2040f55fb5637402949c132fd30b243dbe2310e8de29f5f51b1b9" exitCode=143
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.323121 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66a9241a-965c-4942-9628-d09795a60b56","Type":"ContainerDied","Data":"11c8b8c7c8732710962d56818fd5650636ac9072a20f2e9660e87c042bba016b"}
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.323431 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66a9241a-965c-4942-9628-d09795a60b56","Type":"ContainerDied","Data":"5f9b935151b2040f55fb5637402949c132fd30b243dbe2310e8de29f5f51b1b9"}
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.438932 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.533414 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9jbg\" (UniqueName: \"kubernetes.io/projected/66a9241a-965c-4942-9628-d09795a60b56-kube-api-access-x9jbg\") pod \"66a9241a-965c-4942-9628-d09795a60b56\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") "
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.533628 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-config-data\") pod \"66a9241a-965c-4942-9628-d09795a60b56\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") "
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.533651 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a9241a-965c-4942-9628-d09795a60b56-logs\") pod \"66a9241a-965c-4942-9628-d09795a60b56\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") "
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.533696 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-combined-ca-bundle\") pod \"66a9241a-965c-4942-9628-d09795a60b56\" (UID: \"66a9241a-965c-4942-9628-d09795a60b56\") "
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.538175 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a9241a-965c-4942-9628-d09795a60b56-logs" (OuterVolumeSpecName: "logs") pod "66a9241a-965c-4942-9628-d09795a60b56" (UID: "66a9241a-965c-4942-9628-d09795a60b56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.558352 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a9241a-965c-4942-9628-d09795a60b56-kube-api-access-x9jbg" (OuterVolumeSpecName: "kube-api-access-x9jbg") pod "66a9241a-965c-4942-9628-d09795a60b56" (UID: "66a9241a-965c-4942-9628-d09795a60b56"). InnerVolumeSpecName "kube-api-access-x9jbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.565849 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-config-data" (OuterVolumeSpecName: "config-data") pod "66a9241a-965c-4942-9628-d09795a60b56" (UID: "66a9241a-965c-4942-9628-d09795a60b56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.567603 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66a9241a-965c-4942-9628-d09795a60b56" (UID: "66a9241a-965c-4942-9628-d09795a60b56"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.636081 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9jbg\" (UniqueName: \"kubernetes.io/projected/66a9241a-965c-4942-9628-d09795a60b56-kube-api-access-x9jbg\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.636123 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.636138 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66a9241a-965c-4942-9628-d09795a60b56-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:43 crc kubenswrapper[4825]: I1007 19:20:43.636152 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a9241a-965c-4942-9628-d09795a60b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.307585 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.338463 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.338470 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66a9241a-965c-4942-9628-d09795a60b56","Type":"ContainerDied","Data":"3c1c8101fc70797f3dca8098774354876aa8db82b964cb6ac2d5bf857f1cd311"} Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.339131 4825 scope.go:117] "RemoveContainer" containerID="11c8b8c7c8732710962d56818fd5650636ac9072a20f2e9660e87c042bba016b" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.375837 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.377056 4825 scope.go:117] "RemoveContainer" containerID="5f9b935151b2040f55fb5637402949c132fd30b243dbe2310e8de29f5f51b1b9" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.386238 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.395855 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:44 crc kubenswrapper[4825]: E1007 19:20:44.396688 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-log" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.396705 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-log" Oct 07 19:20:44 crc kubenswrapper[4825]: E1007 19:20:44.396747 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-metadata" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.396754 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-metadata" Oct 07 19:20:44 crc 
kubenswrapper[4825]: I1007 19:20:44.396967 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-metadata" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.396976 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a9241a-965c-4942-9628-d09795a60b56" containerName="nova-metadata-log" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.397925 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.406660 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.407314 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.411261 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.557193 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.557321 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-logs\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.557434 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.557889 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd8d\" (UniqueName: \"kubernetes.io/projected/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-kube-api-access-rvd8d\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.557968 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-config-data\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.659766 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd8d\" (UniqueName: \"kubernetes.io/projected/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-kube-api-access-rvd8d\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.659850 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-config-data\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.659954 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.660001 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-logs\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.660061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.660810 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-logs\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.667314 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-config-data\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.677075 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.678016 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-rvd8d\" (UniqueName: \"kubernetes.io/projected/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-kube-api-access-rvd8d\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.678718 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " pod="openstack/nova-metadata-0" Oct 07 19:20:44 crc kubenswrapper[4825]: I1007 19:20:44.725475 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:45 crc kubenswrapper[4825]: I1007 19:20:45.236703 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:45 crc kubenswrapper[4825]: W1007 19:20:45.244576 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75aebd00_ac39_4a1f_b483_eefa6e2ee0ca.slice/crio-f74227d91c7dd7c5cfa833fa6e36bdd970252ff425a73094da052669b0636171 WatchSource:0}: Error finding container f74227d91c7dd7c5cfa833fa6e36bdd970252ff425a73094da052669b0636171: Status 404 returned error can't find the container with id f74227d91c7dd7c5cfa833fa6e36bdd970252ff425a73094da052669b0636171 Oct 07 19:20:45 crc kubenswrapper[4825]: I1007 19:20:45.363968 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca","Type":"ContainerStarted","Data":"f74227d91c7dd7c5cfa833fa6e36bdd970252ff425a73094da052669b0636171"} Oct 07 19:20:45 crc kubenswrapper[4825]: I1007 19:20:45.807732 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a9241a-965c-4942-9628-d09795a60b56" 
path="/var/lib/kubelet/pods/66a9241a-965c-4942-9628-d09795a60b56/volumes" Oct 07 19:20:46 crc kubenswrapper[4825]: I1007 19:20:46.378659 4825 generic.go:334] "Generic (PLEG): container finished" podID="e83137b5-f576-4a66-967b-ccdef3af6897" containerID="4e71f56cbdda4edd6fc279091f95eda084e94d2938c7086ddecacab8a2937970" exitCode=0 Oct 07 19:20:46 crc kubenswrapper[4825]: I1007 19:20:46.378755 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4nvlv" event={"ID":"e83137b5-f576-4a66-967b-ccdef3af6897","Type":"ContainerDied","Data":"4e71f56cbdda4edd6fc279091f95eda084e94d2938c7086ddecacab8a2937970"} Oct 07 19:20:46 crc kubenswrapper[4825]: I1007 19:20:46.381837 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca","Type":"ContainerStarted","Data":"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483"} Oct 07 19:20:46 crc kubenswrapper[4825]: I1007 19:20:46.381884 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca","Type":"ContainerStarted","Data":"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114"} Oct 07 19:20:46 crc kubenswrapper[4825]: I1007 19:20:46.454003 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.453978944 podStartE2EDuration="2.453978944s" podCreationTimestamp="2025-10-07 19:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:46.435827456 +0000 UTC m=+1235.257866113" watchObservedRunningTime="2025-10-07 19:20:46.453978944 +0000 UTC m=+1235.276017581" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.392791 4825 generic.go:334] "Generic (PLEG): container finished" podID="67d7cb41-3e2e-4583-8552-665f52d70bc7" 
containerID="43deadf38680adb256c4ca4664d6b3186ab3de0e6a2559a1fdad8d7494a8ebdd" exitCode=0 Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.392845 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ntw92" event={"ID":"67d7cb41-3e2e-4583-8552-665f52d70bc7","Type":"ContainerDied","Data":"43deadf38680adb256c4ca4664d6b3186ab3de0e6a2559a1fdad8d7494a8ebdd"} Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.579149 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.579549 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.604640 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.644390 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.829639 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.926550 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-combined-ca-bundle\") pod \"e83137b5-f576-4a66-967b-ccdef3af6897\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.926598 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-config-data\") pod \"e83137b5-f576-4a66-967b-ccdef3af6897\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.926681 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99g9b\" (UniqueName: \"kubernetes.io/projected/e83137b5-f576-4a66-967b-ccdef3af6897-kube-api-access-99g9b\") pod \"e83137b5-f576-4a66-967b-ccdef3af6897\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.926745 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-scripts\") pod \"e83137b5-f576-4a66-967b-ccdef3af6897\" (UID: \"e83137b5-f576-4a66-967b-ccdef3af6897\") " Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.937424 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-scripts" (OuterVolumeSpecName: "scripts") pod "e83137b5-f576-4a66-967b-ccdef3af6897" (UID: "e83137b5-f576-4a66-967b-ccdef3af6897"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.937456 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83137b5-f576-4a66-967b-ccdef3af6897-kube-api-access-99g9b" (OuterVolumeSpecName: "kube-api-access-99g9b") pod "e83137b5-f576-4a66-967b-ccdef3af6897" (UID: "e83137b5-f576-4a66-967b-ccdef3af6897"). InnerVolumeSpecName "kube-api-access-99g9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.970696 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e83137b5-f576-4a66-967b-ccdef3af6897" (UID: "e83137b5-f576-4a66-967b-ccdef3af6897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:47 crc kubenswrapper[4825]: I1007 19:20:47.982559 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-config-data" (OuterVolumeSpecName: "config-data") pod "e83137b5-f576-4a66-967b-ccdef3af6897" (UID: "e83137b5-f576-4a66-967b-ccdef3af6897"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.010413 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.043703 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.043748 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.043760 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99g9b\" (UniqueName: \"kubernetes.io/projected/e83137b5-f576-4a66-967b-ccdef3af6897-kube-api-access-99g9b\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.043773 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83137b5-f576-4a66-967b-ccdef3af6897-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.091744 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-z8gvh"] Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.092036 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" podUID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerName="dnsmasq-dns" containerID="cri-o://5b5584025e6a7261245126fa84d003b6cd1b5d53330331c728dbcf2b937387a3" gracePeriod=10 Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.405197 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4nvlv" 
event={"ID":"e83137b5-f576-4a66-967b-ccdef3af6897","Type":"ContainerDied","Data":"d5d046fa084a14bded94246a90eaa08cc1953e53e4fd62ca1f902642e8749c56"} Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.405499 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5d046fa084a14bded94246a90eaa08cc1953e53e4fd62ca1f902642e8749c56" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.405555 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4nvlv" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.413609 4825 generic.go:334] "Generic (PLEG): container finished" podID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerID="5b5584025e6a7261245126fa84d003b6cd1b5d53330331c728dbcf2b937387a3" exitCode=0 Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.413808 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" event={"ID":"a011ec14-9f09-4d4a-98e3-607190afaaa9","Type":"ContainerDied","Data":"5b5584025e6a7261245126fa84d003b6cd1b5d53330331c728dbcf2b937387a3"} Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.465698 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.496926 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.564733 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-svc\") pod \"a011ec14-9f09-4d4a-98e3-607190afaaa9\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.565061 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-sb\") pod \"a011ec14-9f09-4d4a-98e3-607190afaaa9\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.565557 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-config\") pod \"a011ec14-9f09-4d4a-98e3-607190afaaa9\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.565606 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-nb\") pod \"a011ec14-9f09-4d4a-98e3-607190afaaa9\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.565627 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-swift-storage-0\") pod \"a011ec14-9f09-4d4a-98e3-607190afaaa9\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.565686 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfqlc\" 
(UniqueName: \"kubernetes.io/projected/a011ec14-9f09-4d4a-98e3-607190afaaa9-kube-api-access-pfqlc\") pod \"a011ec14-9f09-4d4a-98e3-607190afaaa9\" (UID: \"a011ec14-9f09-4d4a-98e3-607190afaaa9\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.572157 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a011ec14-9f09-4d4a-98e3-607190afaaa9-kube-api-access-pfqlc" (OuterVolumeSpecName: "kube-api-access-pfqlc") pod "a011ec14-9f09-4d4a-98e3-607190afaaa9" (UID: "a011ec14-9f09-4d4a-98e3-607190afaaa9"). InnerVolumeSpecName "kube-api-access-pfqlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.615698 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.624361 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.624635 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.633756 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a011ec14-9f09-4d4a-98e3-607190afaaa9" (UID: "a011ec14-9f09-4d4a-98e3-607190afaaa9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.637027 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-config" (OuterVolumeSpecName: "config") pod "a011ec14-9f09-4d4a-98e3-607190afaaa9" (UID: "a011ec14-9f09-4d4a-98e3-607190afaaa9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.637110 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a011ec14-9f09-4d4a-98e3-607190afaaa9" (UID: "a011ec14-9f09-4d4a-98e3-607190afaaa9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.666899 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.666934 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfqlc\" (UniqueName: \"kubernetes.io/projected/a011ec14-9f09-4d4a-98e3-607190afaaa9-kube-api-access-pfqlc\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.666946 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.666956 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc 
kubenswrapper[4825]: I1007 19:20:48.675125 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.675345 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-log" containerID="cri-o://be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114" gracePeriod=30 Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.675737 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-metadata" containerID="cri-o://f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483" gracePeriod=30 Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.677974 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a011ec14-9f09-4d4a-98e3-607190afaaa9" (UID: "a011ec14-9f09-4d4a-98e3-607190afaaa9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.723576 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a011ec14-9f09-4d4a-98e3-607190afaaa9" (UID: "a011ec14-9f09-4d4a-98e3-607190afaaa9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.768902 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.768936 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011ec14-9f09-4d4a-98e3-607190afaaa9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.830105 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ntw92" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.870029 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-scripts\") pod \"67d7cb41-3e2e-4583-8552-665f52d70bc7\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.870122 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-config-data\") pod \"67d7cb41-3e2e-4583-8552-665f52d70bc7\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.870178 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-combined-ca-bundle\") pod \"67d7cb41-3e2e-4583-8552-665f52d70bc7\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.870351 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8rlk2\" (UniqueName: \"kubernetes.io/projected/67d7cb41-3e2e-4583-8552-665f52d70bc7-kube-api-access-8rlk2\") pod \"67d7cb41-3e2e-4583-8552-665f52d70bc7\" (UID: \"67d7cb41-3e2e-4583-8552-665f52d70bc7\") " Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.873852 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-scripts" (OuterVolumeSpecName: "scripts") pod "67d7cb41-3e2e-4583-8552-665f52d70bc7" (UID: "67d7cb41-3e2e-4583-8552-665f52d70bc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.880487 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d7cb41-3e2e-4583-8552-665f52d70bc7-kube-api-access-8rlk2" (OuterVolumeSpecName: "kube-api-access-8rlk2") pod "67d7cb41-3e2e-4583-8552-665f52d70bc7" (UID: "67d7cb41-3e2e-4583-8552-665f52d70bc7"). InnerVolumeSpecName "kube-api-access-8rlk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.903133 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-config-data" (OuterVolumeSpecName: "config-data") pod "67d7cb41-3e2e-4583-8552-665f52d70bc7" (UID: "67d7cb41-3e2e-4583-8552-665f52d70bc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.909351 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67d7cb41-3e2e-4583-8552-665f52d70bc7" (UID: "67d7cb41-3e2e-4583-8552-665f52d70bc7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.968975 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.972627 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.972654 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.972666 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d7cb41-3e2e-4583-8552-665f52d70bc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:48 crc kubenswrapper[4825]: I1007 19:20:48.972680 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rlk2\" (UniqueName: \"kubernetes.io/projected/67d7cb41-3e2e-4583-8552-665f52d70bc7-kube-api-access-8rlk2\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.163027 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.186154 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-logs\") pod \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.186819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-config-data\") pod \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.187051 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-nova-metadata-tls-certs\") pod \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.187047 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-logs" (OuterVolumeSpecName: "logs") pod "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" (UID: "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.187123 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-combined-ca-bundle\") pod \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.187221 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvd8d\" (UniqueName: \"kubernetes.io/projected/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-kube-api-access-rvd8d\") pod \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\" (UID: \"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca\") " Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.187932 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.194995 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-kube-api-access-rvd8d" (OuterVolumeSpecName: "kube-api-access-rvd8d") pod "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" (UID: "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca"). InnerVolumeSpecName "kube-api-access-rvd8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.216371 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-config-data" (OuterVolumeSpecName: "config-data") pod "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" (UID: "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.222477 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" (UID: "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.242741 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" (UID: "75aebd00-ac39-4a1f-b483-eefa6e2ee0ca"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.289947 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.289979 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.289991 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvd8d\" (UniqueName: \"kubernetes.io/projected/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-kube-api-access-rvd8d\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.290000 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.423429 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" event={"ID":"a011ec14-9f09-4d4a-98e3-607190afaaa9","Type":"ContainerDied","Data":"3d698efb1ed358e624bc95353be0ec6591182d23fb4f58ef04cde08e24373b63"} Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.423499 4825 scope.go:117] "RemoveContainer" containerID="5b5584025e6a7261245126fa84d003b6cd1b5d53330331c728dbcf2b937387a3" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.423680 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-z8gvh" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.430365 4825 generic.go:334] "Generic (PLEG): container finished" podID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerID="f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483" exitCode=0 Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.430403 4825 generic.go:334] "Generic (PLEG): container finished" podID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerID="be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114" exitCode=143 Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.430460 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca","Type":"ContainerDied","Data":"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483"} Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.430497 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca","Type":"ContainerDied","Data":"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114"} Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.430515 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"75aebd00-ac39-4a1f-b483-eefa6e2ee0ca","Type":"ContainerDied","Data":"f74227d91c7dd7c5cfa833fa6e36bdd970252ff425a73094da052669b0636171"} Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.430585 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.435164 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-log" containerID="cri-o://cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513" gracePeriod=30 Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.435449 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ntw92" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.439293 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ntw92" event={"ID":"67d7cb41-3e2e-4583-8552-665f52d70bc7","Type":"ContainerDied","Data":"4f62b6555432aee077b3ca596b19e211efd191968ceaa239e97788081926a09f"} Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.439460 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f62b6555432aee077b3ca596b19e211efd191968ceaa239e97788081926a09f" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.444213 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-api" containerID="cri-o://83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d" gracePeriod=30 Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.460139 4825 scope.go:117] "RemoveContainer" containerID="f01a6722eccc007ce9f3c8261a662597e8500d2dc54a9e24a94e83a77b90a126" Oct 07 19:20:49 crc 
kubenswrapper[4825]: I1007 19:20:49.492695 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.518593 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d7cb41-3e2e-4583-8552-665f52d70bc7" containerName="nova-cell1-conductor-db-sync" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.518651 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d7cb41-3e2e-4583-8552-665f52d70bc7" containerName="nova-cell1-conductor-db-sync" Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.518729 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83137b5-f576-4a66-967b-ccdef3af6897" containerName="nova-manage" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.518740 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83137b5-f576-4a66-967b-ccdef3af6897" containerName="nova-manage" Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.518769 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-metadata" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.518781 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-metadata" Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.518818 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerName="init" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.518826 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerName="init" Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.518850 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerName="dnsmasq-dns" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.518858 
4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerName="dnsmasq-dns" Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.518894 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-log" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.518904 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-log" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.519412 4825 scope.go:117] "RemoveContainer" containerID="f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.519724 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-metadata" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.519781 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83137b5-f576-4a66-967b-ccdef3af6897" containerName="nova-manage" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.519802 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d7cb41-3e2e-4583-8552-665f52d70bc7" containerName="nova-cell1-conductor-db-sync" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.519828 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a011ec14-9f09-4d4a-98e3-607190afaaa9" containerName="dnsmasq-dns" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.519857 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" containerName="nova-metadata-log" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.521160 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.521302 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.525382 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.536489 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-z8gvh"] Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.564480 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-z8gvh"] Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.579301 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.590519 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.603298 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.604644 4825 scope.go:117] "RemoveContainer" containerID="be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.604856 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.608582 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.608731 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.618329 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4r9\" (UniqueName: \"kubernetes.io/projected/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-kube-api-access-ch4r9\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.618433 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.618491 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.628161 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.637713 4825 scope.go:117] "RemoveContainer" containerID="f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483" Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.638159 4825 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483\": container with ID starting with f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483 not found: ID does not exist" containerID="f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.638252 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483"} err="failed to get container status \"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483\": rpc error: code = NotFound desc = could not find container \"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483\": container with ID starting with f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483 not found: ID does not exist" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.638282 4825 scope.go:117] "RemoveContainer" containerID="be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114" Oct 07 19:20:49 crc kubenswrapper[4825]: E1007 19:20:49.638666 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114\": container with ID starting with be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114 not found: ID does not exist" containerID="be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.638699 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114"} err="failed to get container status \"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114\": rpc error: code = NotFound 
desc = could not find container \"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114\": container with ID starting with be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114 not found: ID does not exist" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.638803 4825 scope.go:117] "RemoveContainer" containerID="f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.639067 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483"} err="failed to get container status \"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483\": rpc error: code = NotFound desc = could not find container \"f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483\": container with ID starting with f719b35b6c8bc4a418026ff17dd84587f30736202eebce290536b7047a618483 not found: ID does not exist" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.639111 4825 scope.go:117] "RemoveContainer" containerID="be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.639357 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114"} err="failed to get container status \"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114\": rpc error: code = NotFound desc = could not find container \"be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114\": container with ID starting with be306ac2eed82cd7f2ac12d9ffd00aa12bfb27bddfcd39071c9b43b058829114 not found: ID does not exist" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.720492 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-config-data\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.720536 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-logs\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.720573 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.720638 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch4r9\" (UniqueName: \"kubernetes.io/projected/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-kube-api-access-ch4r9\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.720832 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.721030 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.721162 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.721263 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md5n5\" (UniqueName: \"kubernetes.io/projected/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-kube-api-access-md5n5\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.725557 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.725703 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.739854 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch4r9\" (UniqueName: \"kubernetes.io/projected/df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7-kube-api-access-ch4r9\") pod \"nova-cell1-conductor-0\" (UID: \"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7\") " pod="openstack/nova-cell1-conductor-0" Oct 07 
19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.805317 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75aebd00-ac39-4a1f-b483-eefa6e2ee0ca" path="/var/lib/kubelet/pods/75aebd00-ac39-4a1f-b483-eefa6e2ee0ca/volumes" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.805898 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a011ec14-9f09-4d4a-98e3-607190afaaa9" path="/var/lib/kubelet/pods/a011ec14-9f09-4d4a-98e3-607190afaaa9/volumes" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.823117 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.823443 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.823541 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md5n5\" (UniqueName: \"kubernetes.io/projected/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-kube-api-access-md5n5\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.823674 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-config-data\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc 
kubenswrapper[4825]: I1007 19:20:49.823773 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-logs\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.824248 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-logs\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.826939 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.827623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-config-data\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.827983 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.852670 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md5n5\" (UniqueName: \"kubernetes.io/projected/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-kube-api-access-md5n5\") 
pod \"nova-metadata-0\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " pod="openstack/nova-metadata-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.870771 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:49 crc kubenswrapper[4825]: I1007 19:20:49.935940 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:20:50 crc kubenswrapper[4825]: I1007 19:20:50.371391 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 19:20:50 crc kubenswrapper[4825]: W1007 19:20:50.372552 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf2bca65_1a0f_4e1c_ba16_5b18bb7e71b7.slice/crio-8e1786fafeb9a43845fe2511d1bbecddee238aa31eee1bd093461979375e529a WatchSource:0}: Error finding container 8e1786fafeb9a43845fe2511d1bbecddee238aa31eee1bd093461979375e529a: Status 404 returned error can't find the container with id 8e1786fafeb9a43845fe2511d1bbecddee238aa31eee1bd093461979375e529a Oct 07 19:20:50 crc kubenswrapper[4825]: I1007 19:20:50.441160 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:20:50 crc kubenswrapper[4825]: I1007 19:20:50.450259 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7","Type":"ContainerStarted","Data":"8e1786fafeb9a43845fe2511d1bbecddee238aa31eee1bd093461979375e529a"} Oct 07 19:20:50 crc kubenswrapper[4825]: W1007 19:20:50.453253 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2881a5f6_3d20_4bbb_a82e_9aafb0076ca7.slice/crio-3a8046dd381ce5afe7984f2e27565726032c55ce6b0b8a1b0983051a17cc4039 WatchSource:0}: Error finding container 
3a8046dd381ce5afe7984f2e27565726032c55ce6b0b8a1b0983051a17cc4039: Status 404 returned error can't find the container with id 3a8046dd381ce5afe7984f2e27565726032c55ce6b0b8a1b0983051a17cc4039 Oct 07 19:20:50 crc kubenswrapper[4825]: I1007 19:20:50.455489 4825 generic.go:334] "Generic (PLEG): container finished" podID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerID="cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513" exitCode=143 Oct 07 19:20:50 crc kubenswrapper[4825]: I1007 19:20:50.455726 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a92ef064-089b-4c42-b0b4-dcd9c54d75a3" containerName="nova-scheduler-scheduler" containerID="cri-o://966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7" gracePeriod=30 Oct 07 19:20:50 crc kubenswrapper[4825]: I1007 19:20:50.456013 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3270262-7d96-4271-8be1-c01393ae7bc4","Type":"ContainerDied","Data":"cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513"} Oct 07 19:20:51 crc kubenswrapper[4825]: I1007 19:20:51.475511 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7","Type":"ContainerStarted","Data":"e0e70ce7efd86dde133c4dff3b1325981cf26a72205e58196bcf2e5f6086cf19"} Oct 07 19:20:51 crc kubenswrapper[4825]: I1007 19:20:51.475853 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:51 crc kubenswrapper[4825]: I1007 19:20:51.480042 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7","Type":"ContainerStarted","Data":"90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1"} Oct 07 19:20:51 crc kubenswrapper[4825]: I1007 19:20:51.480119 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7","Type":"ContainerStarted","Data":"d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc"} Oct 07 19:20:51 crc kubenswrapper[4825]: I1007 19:20:51.480146 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7","Type":"ContainerStarted","Data":"3a8046dd381ce5afe7984f2e27565726032c55ce6b0b8a1b0983051a17cc4039"} Oct 07 19:20:51 crc kubenswrapper[4825]: I1007 19:20:51.512362 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.51233791 podStartE2EDuration="2.51233791s" podCreationTimestamp="2025-10-07 19:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:51.497437922 +0000 UTC m=+1240.319476569" watchObservedRunningTime="2025-10-07 19:20:51.51233791 +0000 UTC m=+1240.334376557" Oct 07 19:20:51 crc kubenswrapper[4825]: I1007 19:20:51.532084 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.532064478 podStartE2EDuration="2.532064478s" podCreationTimestamp="2025-10-07 19:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:51.51744991 +0000 UTC m=+1240.339488627" watchObservedRunningTime="2025-10-07 19:20:51.532064478 +0000 UTC m=+1240.354103125" Oct 07 19:20:52 crc kubenswrapper[4825]: E1007 19:20:52.606214 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] 
Oct 07 19:20:52 crc kubenswrapper[4825]: E1007 19:20:52.608559 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 19:20:52 crc kubenswrapper[4825]: E1007 19:20:52.609972 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 19:20:52 crc kubenswrapper[4825]: E1007 19:20:52.610032 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a92ef064-089b-4c42-b0b4-dcd9c54d75a3" containerName="nova-scheduler-scheduler" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.499591 4825 generic.go:334] "Generic (PLEG): container finished" podID="a92ef064-089b-4c42-b0b4-dcd9c54d75a3" containerID="966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7" exitCode=0 Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.499915 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a92ef064-089b-4c42-b0b4-dcd9c54d75a3","Type":"ContainerDied","Data":"966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7"} Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.499946 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"a92ef064-089b-4c42-b0b4-dcd9c54d75a3","Type":"ContainerDied","Data":"0aaba32f719bfacfe0a09efa5241286a96bba9858d1ff91ece2cd2a464766c34"} Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.499959 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aaba32f719bfacfe0a09efa5241286a96bba9858d1ff91ece2cd2a464766c34" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.504566 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.595557 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-combined-ca-bundle\") pod \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.595669 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-config-data\") pod \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.595731 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29hkl\" (UniqueName: \"kubernetes.io/projected/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-kube-api-access-29hkl\") pod \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\" (UID: \"a92ef064-089b-4c42-b0b4-dcd9c54d75a3\") " Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.601724 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-kube-api-access-29hkl" (OuterVolumeSpecName: "kube-api-access-29hkl") pod "a92ef064-089b-4c42-b0b4-dcd9c54d75a3" (UID: "a92ef064-089b-4c42-b0b4-dcd9c54d75a3"). 
InnerVolumeSpecName "kube-api-access-29hkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.632424 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a92ef064-089b-4c42-b0b4-dcd9c54d75a3" (UID: "a92ef064-089b-4c42-b0b4-dcd9c54d75a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.634723 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-config-data" (OuterVolumeSpecName: "config-data") pod "a92ef064-089b-4c42-b0b4-dcd9c54d75a3" (UID: "a92ef064-089b-4c42-b0b4-dcd9c54d75a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.698685 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.699008 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:53 crc kubenswrapper[4825]: I1007 19:20:53.699112 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29hkl\" (UniqueName: \"kubernetes.io/projected/a92ef064-089b-4c42-b0b4-dcd9c54d75a3-kube-api-access-29hkl\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.381902 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.413493 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-config-data\") pod \"e3270262-7d96-4271-8be1-c01393ae7bc4\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.413556 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-combined-ca-bundle\") pod \"e3270262-7d96-4271-8be1-c01393ae7bc4\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.413596 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptxzf\" (UniqueName: \"kubernetes.io/projected/e3270262-7d96-4271-8be1-c01393ae7bc4-kube-api-access-ptxzf\") pod \"e3270262-7d96-4271-8be1-c01393ae7bc4\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.413788 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3270262-7d96-4271-8be1-c01393ae7bc4-logs\") pod \"e3270262-7d96-4271-8be1-c01393ae7bc4\" (UID: \"e3270262-7d96-4271-8be1-c01393ae7bc4\") " Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.414693 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3270262-7d96-4271-8be1-c01393ae7bc4-logs" (OuterVolumeSpecName: "logs") pod "e3270262-7d96-4271-8be1-c01393ae7bc4" (UID: "e3270262-7d96-4271-8be1-c01393ae7bc4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.457929 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3270262-7d96-4271-8be1-c01393ae7bc4-kube-api-access-ptxzf" (OuterVolumeSpecName: "kube-api-access-ptxzf") pod "e3270262-7d96-4271-8be1-c01393ae7bc4" (UID: "e3270262-7d96-4271-8be1-c01393ae7bc4"). InnerVolumeSpecName "kube-api-access-ptxzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.459592 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-config-data" (OuterVolumeSpecName: "config-data") pod "e3270262-7d96-4271-8be1-c01393ae7bc4" (UID: "e3270262-7d96-4271-8be1-c01393ae7bc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.478952 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3270262-7d96-4271-8be1-c01393ae7bc4" (UID: "e3270262-7d96-4271-8be1-c01393ae7bc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.513989 4825 generic.go:334] "Generic (PLEG): container finished" podID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerID="83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d" exitCode=0 Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.514058 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.514087 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3270262-7d96-4271-8be1-c01393ae7bc4","Type":"ContainerDied","Data":"83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d"} Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.514137 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3270262-7d96-4271-8be1-c01393ae7bc4","Type":"ContainerDied","Data":"608d889bf635c388ae60cd35da8633b56e012804638e08a91c8791bcea6b96db"} Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.514072 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.514178 4825 scope.go:117] "RemoveContainer" containerID="83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.515276 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3270262-7d96-4271-8be1-c01393ae7bc4-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.515303 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.515315 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3270262-7d96-4271-8be1-c01393ae7bc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.515325 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptxzf\" (UniqueName: 
\"kubernetes.io/projected/e3270262-7d96-4271-8be1-c01393ae7bc4-kube-api-access-ptxzf\") on node \"crc\" DevicePath \"\"" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.564158 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.578357 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.578730 4825 scope.go:117] "RemoveContainer" containerID="cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.596515 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: E1007 19:20:54.597010 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-api" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.597024 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-api" Oct 07 19:20:54 crc kubenswrapper[4825]: E1007 19:20:54.597037 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92ef064-089b-4c42-b0b4-dcd9c54d75a3" containerName="nova-scheduler-scheduler" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.597049 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92ef064-089b-4c42-b0b4-dcd9c54d75a3" containerName="nova-scheduler-scheduler" Oct 07 19:20:54 crc kubenswrapper[4825]: E1007 19:20:54.597059 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-log" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.597067 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-log" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 
19:20:54.597338 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-log" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.597365 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" containerName="nova-api-api" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.597382 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92ef064-089b-4c42-b0b4-dcd9c54d75a3" containerName="nova-scheduler-scheduler" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.605896 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.613677 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.620190 4825 scope.go:117] "RemoveContainer" containerID="83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d" Oct 07 19:20:54 crc kubenswrapper[4825]: E1007 19:20:54.621551 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d\": container with ID starting with 83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d not found: ID does not exist" containerID="83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.621604 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d"} err="failed to get container status \"83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d\": rpc error: code = NotFound desc = could not find container 
\"83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d\": container with ID starting with 83cb41800a302d9b3d62768b0927e59bf89153bdb24a284c364aa4a3bbc2f35d not found: ID does not exist" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.621629 4825 scope.go:117] "RemoveContainer" containerID="cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513" Oct 07 19:20:54 crc kubenswrapper[4825]: E1007 19:20:54.622645 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513\": container with ID starting with cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513 not found: ID does not exist" containerID="cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.622697 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513"} err="failed to get container status \"cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513\": rpc error: code = NotFound desc = could not find container \"cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513\": container with ID starting with cb7af0619ecda6bd3774bf7b7e15f5fb0ea03c7195bbf641571c746825be6513 not found: ID does not exist" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.628295 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.661814 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.670297 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.677153 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.679161 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.682047 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.688103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.718689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-config-data\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.718729 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.718753 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.718973 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kng26\" (UniqueName: \"kubernetes.io/projected/5db0052e-c084-496c-bd2a-798eedf8459c-kube-api-access-kng26\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " 
pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.719045 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-config-data\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.719347 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nczqp\" (UniqueName: \"kubernetes.io/projected/624e8227-71a0-4baa-883d-5f5f9cab15af-kube-api-access-nczqp\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.719440 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5db0052e-c084-496c-bd2a-798eedf8459c-logs\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.821718 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5db0052e-c084-496c-bd2a-798eedf8459c-logs\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.821838 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-config-data\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.821870 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.821912 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.822667 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5db0052e-c084-496c-bd2a-798eedf8459c-logs\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.823262 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kng26\" (UniqueName: \"kubernetes.io/projected/5db0052e-c084-496c-bd2a-798eedf8459c-kube-api-access-kng26\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.823336 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-config-data\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.824497 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nczqp\" (UniqueName: \"kubernetes.io/projected/624e8227-71a0-4baa-883d-5f5f9cab15af-kube-api-access-nczqp\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 
19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.826106 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-config-data\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.826201 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.826496 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.830432 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-config-data\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.842272 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nczqp\" (UniqueName: \"kubernetes.io/projected/624e8227-71a0-4baa-883d-5f5f9cab15af-kube-api-access-nczqp\") pod \"nova-scheduler-0\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.855778 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kng26\" (UniqueName: 
\"kubernetes.io/projected/5db0052e-c084-496c-bd2a-798eedf8459c-kube-api-access-kng26\") pod \"nova-api-0\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " pod="openstack/nova-api-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.926662 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.937107 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.938568 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 19:20:54 crc kubenswrapper[4825]: I1007 19:20:54.997382 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:20:55 crc kubenswrapper[4825]: W1007 19:20:55.439254 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod624e8227_71a0_4baa_883d_5f5f9cab15af.slice/crio-d5b71a5e05a967f8842cd6b63720f2202ea605f55369653512513d1026412889 WatchSource:0}: Error finding container d5b71a5e05a967f8842cd6b63720f2202ea605f55369653512513d1026412889: Status 404 returned error can't find the container with id d5b71a5e05a967f8842cd6b63720f2202ea605f55369653512513d1026412889 Oct 07 19:20:55 crc kubenswrapper[4825]: I1007 19:20:55.440687 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:20:55 crc kubenswrapper[4825]: I1007 19:20:55.512630 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:20:55 crc kubenswrapper[4825]: W1007 19:20:55.520374 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5db0052e_c084_496c_bd2a_798eedf8459c.slice/crio-64d5dda7f6094cb3d1a4efd7af6b14fc5916d86ec89447371e89a3a60fa0d476 
WatchSource:0}: Error finding container 64d5dda7f6094cb3d1a4efd7af6b14fc5916d86ec89447371e89a3a60fa0d476: Status 404 returned error can't find the container with id 64d5dda7f6094cb3d1a4efd7af6b14fc5916d86ec89447371e89a3a60fa0d476 Oct 07 19:20:55 crc kubenswrapper[4825]: I1007 19:20:55.531312 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"624e8227-71a0-4baa-883d-5f5f9cab15af","Type":"ContainerStarted","Data":"d5b71a5e05a967f8842cd6b63720f2202ea605f55369653512513d1026412889"} Oct 07 19:20:55 crc kubenswrapper[4825]: I1007 19:20:55.808525 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92ef064-089b-4c42-b0b4-dcd9c54d75a3" path="/var/lib/kubelet/pods/a92ef064-089b-4c42-b0b4-dcd9c54d75a3/volumes" Oct 07 19:20:55 crc kubenswrapper[4825]: I1007 19:20:55.810397 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3270262-7d96-4271-8be1-c01393ae7bc4" path="/var/lib/kubelet/pods/e3270262-7d96-4271-8be1-c01393ae7bc4/volumes" Oct 07 19:20:56 crc kubenswrapper[4825]: I1007 19:20:56.542923 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5db0052e-c084-496c-bd2a-798eedf8459c","Type":"ContainerStarted","Data":"77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677"} Oct 07 19:20:56 crc kubenswrapper[4825]: I1007 19:20:56.542968 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5db0052e-c084-496c-bd2a-798eedf8459c","Type":"ContainerStarted","Data":"8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174"} Oct 07 19:20:56 crc kubenswrapper[4825]: I1007 19:20:56.542981 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5db0052e-c084-496c-bd2a-798eedf8459c","Type":"ContainerStarted","Data":"64d5dda7f6094cb3d1a4efd7af6b14fc5916d86ec89447371e89a3a60fa0d476"} Oct 07 19:20:56 crc kubenswrapper[4825]: I1007 19:20:56.544852 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"624e8227-71a0-4baa-883d-5f5f9cab15af","Type":"ContainerStarted","Data":"d470b8db9b6d817181111638be9f93c24e7ef0d4077b24d541a9ec238db6873f"} Oct 07 19:20:56 crc kubenswrapper[4825]: I1007 19:20:56.573267 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.573223013 podStartE2EDuration="2.573223013s" podCreationTimestamp="2025-10-07 19:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:56.561747734 +0000 UTC m=+1245.383786401" watchObservedRunningTime="2025-10-07 19:20:56.573223013 +0000 UTC m=+1245.395261670" Oct 07 19:20:59 crc kubenswrapper[4825]: I1007 19:20:59.909077 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 19:20:59 crc kubenswrapper[4825]: I1007 19:20:59.927933 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 19:20:59 crc kubenswrapper[4825]: I1007 19:20:59.937078 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=5.937050909 podStartE2EDuration="5.937050909s" podCreationTimestamp="2025-10-07 19:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:20:56.588484851 +0000 UTC m=+1245.410523498" watchObservedRunningTime="2025-10-07 19:20:59.937050909 +0000 UTC m=+1248.759089586" Oct 07 19:20:59 crc kubenswrapper[4825]: I1007 19:20:59.937614 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 19:20:59 crc kubenswrapper[4825]: I1007 19:20:59.937779 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Oct 07 19:21:00 crc kubenswrapper[4825]: I1007 19:21:00.964414 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 19:21:00 crc kubenswrapper[4825]: I1007 19:21:00.964414 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 19:21:04 crc kubenswrapper[4825]: I1007 19:21:04.927708 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 19:21:04 crc kubenswrapper[4825]: I1007 19:21:04.965516 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 19:21:04 crc kubenswrapper[4825]: I1007 19:21:04.998134 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 19:21:04 crc kubenswrapper[4825]: I1007 19:21:04.998280 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 19:21:05 crc kubenswrapper[4825]: I1007 19:21:05.696310 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 19:21:06 crc kubenswrapper[4825]: I1007 19:21:06.080841 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 
19:21:06 crc kubenswrapper[4825]: I1007 19:21:06.080945 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 19:21:09 crc kubenswrapper[4825]: I1007 19:21:09.944500 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 19:21:09 crc kubenswrapper[4825]: I1007 19:21:09.944994 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 19:21:09 crc kubenswrapper[4825]: I1007 19:21:09.952974 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 19:21:09 crc kubenswrapper[4825]: I1007 19:21:09.967297 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.738905 4825 generic.go:334] "Generic (PLEG): container finished" podID="bc70b027-cab7-490d-a529-e74243d17ede" containerID="18aecafa05947d211ade57c22b90f108df42f5bd60805ded5381dbd3ecc06747" exitCode=137 Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.739004 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc70b027-cab7-490d-a529-e74243d17ede","Type":"ContainerDied","Data":"18aecafa05947d211ade57c22b90f108df42f5bd60805ded5381dbd3ecc06747"} Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.842068 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.924946 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tctd\" (UniqueName: \"kubernetes.io/projected/bc70b027-cab7-490d-a529-e74243d17ede-kube-api-access-9tctd\") pod \"bc70b027-cab7-490d-a529-e74243d17ede\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.925309 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-config-data\") pod \"bc70b027-cab7-490d-a529-e74243d17ede\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.925445 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-combined-ca-bundle\") pod \"bc70b027-cab7-490d-a529-e74243d17ede\" (UID: \"bc70b027-cab7-490d-a529-e74243d17ede\") " Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.931556 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc70b027-cab7-490d-a529-e74243d17ede-kube-api-access-9tctd" (OuterVolumeSpecName: "kube-api-access-9tctd") pod "bc70b027-cab7-490d-a529-e74243d17ede" (UID: "bc70b027-cab7-490d-a529-e74243d17ede"). InnerVolumeSpecName "kube-api-access-9tctd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.974417 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc70b027-cab7-490d-a529-e74243d17ede" (UID: "bc70b027-cab7-490d-a529-e74243d17ede"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:12 crc kubenswrapper[4825]: I1007 19:21:12.975611 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-config-data" (OuterVolumeSpecName: "config-data") pod "bc70b027-cab7-490d-a529-e74243d17ede" (UID: "bc70b027-cab7-490d-a529-e74243d17ede"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.027472 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tctd\" (UniqueName: \"kubernetes.io/projected/bc70b027-cab7-490d-a529-e74243d17ede-kube-api-access-9tctd\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.027529 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.027545 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70b027-cab7-490d-a529-e74243d17ede-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.752498 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc70b027-cab7-490d-a529-e74243d17ede","Type":"ContainerDied","Data":"cad22c313c51d3a2ebfc0ee6e42dc44321795c31947ca91e1680be8da379644b"} Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.752918 4825 scope.go:117] "RemoveContainer" containerID="18aecafa05947d211ade57c22b90f108df42f5bd60805ded5381dbd3ecc06747" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.752580 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.827537 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.828513 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.850918 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 19:21:13 crc kubenswrapper[4825]: E1007 19:21:13.851523 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc70b027-cab7-490d-a529-e74243d17ede" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.851554 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc70b027-cab7-490d-a529-e74243d17ede" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.851867 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc70b027-cab7-490d-a529-e74243d17ede" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.852896 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.856278 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.856651 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.856744 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.875039 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.948875 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.948964 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xjqw\" (UniqueName: \"kubernetes.io/projected/89c586f2-b817-4c06-92cf-8b7832e8acc6-kube-api-access-6xjqw\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.949002 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:13 
crc kubenswrapper[4825]: I1007 19:21:13.949035 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:13 crc kubenswrapper[4825]: I1007 19:21:13.949112 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.050560 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xjqw\" (UniqueName: \"kubernetes.io/projected/89c586f2-b817-4c06-92cf-8b7832e8acc6-kube-api-access-6xjqw\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.050617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.050651 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc 
kubenswrapper[4825]: I1007 19:21:14.050710 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.050748 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.057080 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.057160 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.057342 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.062887 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c586f2-b817-4c06-92cf-8b7832e8acc6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.067208 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xjqw\" (UniqueName: \"kubernetes.io/projected/89c586f2-b817-4c06-92cf-8b7832e8acc6-kube-api-access-6xjqw\") pod \"nova-cell1-novncproxy-0\" (UID: \"89c586f2-b817-4c06-92cf-8b7832e8acc6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.176187 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.703917 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 19:21:14 crc kubenswrapper[4825]: W1007 19:21:14.709729 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89c586f2_b817_4c06_92cf_8b7832e8acc6.slice/crio-7890f8a09c2043e22903e69608251ddef60b084d347d8ce406776572a7b2f732 WatchSource:0}: Error finding container 7890f8a09c2043e22903e69608251ddef60b084d347d8ce406776572a7b2f732: Status 404 returned error can't find the container with id 7890f8a09c2043e22903e69608251ddef60b084d347d8ce406776572a7b2f732 Oct 07 19:21:14 crc kubenswrapper[4825]: I1007 19:21:14.762940 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"89c586f2-b817-4c06-92cf-8b7832e8acc6","Type":"ContainerStarted","Data":"7890f8a09c2043e22903e69608251ddef60b084d347d8ce406776572a7b2f732"} Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.002735 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 19:21:15 crc 
kubenswrapper[4825]: I1007 19:21:15.003258 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.003291 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.006185 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.780834 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"89c586f2-b817-4c06-92cf-8b7832e8acc6","Type":"ContainerStarted","Data":"20d3798d2b6da08a335513df0ec3ad71a11ea1d3e3213787e78acfd3f748e83b"} Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.781493 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.785456 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.808028 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc70b027-cab7-490d-a529-e74243d17ede" path="/var/lib/kubelet/pods/bc70b027-cab7-490d-a529-e74243d17ede/volumes" Oct 07 19:21:15 crc kubenswrapper[4825]: I1007 19:21:15.812531 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.81250645 podStartE2EDuration="2.81250645s" podCreationTimestamp="2025-10-07 19:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:21:15.80328069 +0000 UTC m=+1264.625319367" watchObservedRunningTime="2025-10-07 19:21:15.81250645 +0000 UTC m=+1264.634545087" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.025768 4825 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l6vtv"] Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.038999 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.053822 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l6vtv"] Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.091770 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.091829 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.091860 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2n7\" (UniqueName: \"kubernetes.io/projected/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-kube-api-access-zc2n7\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.091882 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: 
\"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.091918 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.091951 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-config\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.196266 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.196315 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.196354 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc2n7\" (UniqueName: \"kubernetes.io/projected/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-kube-api-access-zc2n7\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: 
\"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.196395 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.196439 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.196481 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-config\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.197492 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-config\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.198317 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 
crc kubenswrapper[4825]: I1007 19:21:16.198893 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.199704 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.200275 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.231675 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc2n7\" (UniqueName: \"kubernetes.io/projected/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-kube-api-access-zc2n7\") pod \"dnsmasq-dns-59cf4bdb65-l6vtv\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.396846 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:16 crc kubenswrapper[4825]: I1007 19:21:16.934151 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l6vtv"] Oct 07 19:21:17 crc kubenswrapper[4825]: I1007 19:21:17.804022 4825 generic.go:334] "Generic (PLEG): container finished" podID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerID="6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60" exitCode=0 Oct 07 19:21:17 crc kubenswrapper[4825]: I1007 19:21:17.823095 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" event={"ID":"68dafb60-bc0b-4329-9114-91a5b6d5bfe8","Type":"ContainerDied","Data":"6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60"} Oct 07 19:21:17 crc kubenswrapper[4825]: I1007 19:21:17.824061 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" event={"ID":"68dafb60-bc0b-4329-9114-91a5b6d5bfe8","Type":"ContainerStarted","Data":"7db2880860662f1be563985de336c2ff15c7a2d6ba89409d24b34301777afd39"} Oct 07 19:21:17 crc kubenswrapper[4825]: I1007 19:21:17.913126 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:21:17 crc kubenswrapper[4825]: I1007 19:21:17.913484 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-central-agent" containerID="cri-o://be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f" gracePeriod=30 Oct 07 19:21:17 crc kubenswrapper[4825]: I1007 19:21:17.913899 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="proxy-httpd" containerID="cri-o://8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102" gracePeriod=30 Oct 07 19:21:17 crc 
kubenswrapper[4825]: I1007 19:21:17.913958 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="sg-core" containerID="cri-o://dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8" gracePeriod=30 Oct 07 19:21:17 crc kubenswrapper[4825]: I1007 19:21:17.913991 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-notification-agent" containerID="cri-o://df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530" gracePeriod=30 Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.631324 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.827908 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" event={"ID":"68dafb60-bc0b-4329-9114-91a5b6d5bfe8","Type":"ContainerStarted","Data":"cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8"} Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.828685 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834441 4825 generic.go:334] "Generic (PLEG): container finished" podID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerID="8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102" exitCode=0 Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834471 4825 generic.go:334] "Generic (PLEG): container finished" podID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerID="dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8" exitCode=2 Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834499 4825 generic.go:334] "Generic (PLEG): container finished" podID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" 
containerID="be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f" exitCode=0 Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834551 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerDied","Data":"8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102"} Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834618 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerDied","Data":"dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8"} Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerDied","Data":"be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f"} Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834733 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-log" containerID="cri-o://8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174" gracePeriod=30 Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.834767 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-api" containerID="cri-o://77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677" gracePeriod=30 Oct 07 19:21:18 crc kubenswrapper[4825]: I1007 19:21:18.855213 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" podStartSLOduration=3.855192929 podStartE2EDuration="3.855192929s" podCreationTimestamp="2025-10-07 19:21:15 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:21:18.846075813 +0000 UTC m=+1267.668114460" watchObservedRunningTime="2025-10-07 19:21:18.855192929 +0000 UTC m=+1267.677231566" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.176646 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.780420 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.851350 4825 generic.go:334] "Generic (PLEG): container finished" podID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerID="df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530" exitCode=0 Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.851440 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerDied","Data":"df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530"} Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.851442 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.851488 4825 scope.go:117] "RemoveContainer" containerID="8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.851475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e3f5992-6777-4a65-b0f2-2510b363b9bf","Type":"ContainerDied","Data":"01ca31fb3a0dfcda7f5eb72fd774fbdc52dd6f1c6da4e8269bacff612a37f3cd"} Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.854264 4825 generic.go:334] "Generic (PLEG): container finished" podID="5db0052e-c084-496c-bd2a-798eedf8459c" containerID="8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174" exitCode=143 Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.854367 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5db0052e-c084-496c-bd2a-798eedf8459c","Type":"ContainerDied","Data":"8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174"} Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858367 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-combined-ca-bundle\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858422 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-run-httpd\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858537 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-ceilometer-tls-certs\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858572 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-scripts\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858599 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-log-httpd\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858632 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-config-data\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858742 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858844 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-sg-core-conf-yaml\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.858893 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dztw5\" (UniqueName: \"kubernetes.io/projected/1e3f5992-6777-4a65-b0f2-2510b363b9bf-kube-api-access-dztw5\") pod \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\" (UID: \"1e3f5992-6777-4a65-b0f2-2510b363b9bf\") " Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.859817 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.860062 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.860074 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e3f5992-6777-4a65-b0f2-2510b363b9bf-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.865351 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-scripts" (OuterVolumeSpecName: "scripts") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.865471 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3f5992-6777-4a65-b0f2-2510b363b9bf-kube-api-access-dztw5" (OuterVolumeSpecName: "kube-api-access-dztw5") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). InnerVolumeSpecName "kube-api-access-dztw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.890281 4825 scope.go:117] "RemoveContainer" containerID="dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.923725 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.962044 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.965788 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.965823 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.966740 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.966955 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dztw5\" (UniqueName: \"kubernetes.io/projected/1e3f5992-6777-4a65-b0f2-2510b363b9bf-kube-api-access-dztw5\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.974645 4825 scope.go:117] "RemoveContainer" containerID="df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530" Oct 07 19:21:19 crc kubenswrapper[4825]: I1007 19:21:19.984006 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.006594 4825 scope.go:117] "RemoveContainer" containerID="be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.037206 4825 scope.go:117] "RemoveContainer" containerID="8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102" Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.040972 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102\": container with ID starting with 8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102 not found: ID does not exist" containerID="8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.041009 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102"} err="failed to get container status \"8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102\": rpc error: code = NotFound desc = could not find container \"8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102\": container with ID starting with 8f1cf1b07206834cf406f60bb880d5ad6158177e084f3331ea9deae2db7d2102 not found: ID does not exist" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.041134 4825 scope.go:117] "RemoveContainer" containerID="dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8" Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.041519 4825 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8\": container with ID starting with dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8 not found: ID does not exist" containerID="dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.041556 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8"} err="failed to get container status \"dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8\": rpc error: code = NotFound desc = could not find container \"dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8\": container with ID starting with dbd2151af2d21196aacee3d29dc47d2ede9736876a10148966f348386fd0f9f8 not found: ID does not exist" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.041583 4825 scope.go:117] "RemoveContainer" containerID="df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530" Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.044840 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530\": container with ID starting with df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530 not found: ID does not exist" containerID="df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.045352 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530"} err="failed to get container status \"df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530\": rpc error: code = NotFound desc = could not find 
container \"df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530\": container with ID starting with df8fda5ab9c6f67358652fbd71becdb48b8deb099386ecb41a61fdbd12684530 not found: ID does not exist" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.045374 4825 scope.go:117] "RemoveContainer" containerID="be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f" Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.053775 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f\": container with ID starting with be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f not found: ID does not exist" containerID="be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.053844 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f"} err="failed to get container status \"be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f\": rpc error: code = NotFound desc = could not find container \"be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f\": container with ID starting with be34757daaae4d77314af2ecb9877c1dc99dd569f5b016067b6ff2a40eb57c1f not found: ID does not exist" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.055745 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-config-data" (OuterVolumeSpecName: "config-data") pod "1e3f5992-6777-4a65-b0f2-2510b363b9bf" (UID: "1e3f5992-6777-4a65-b0f2-2510b363b9bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.104957 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.104996 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3f5992-6777-4a65-b0f2-2510b363b9bf-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.191896 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.221753 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.230906 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.231350 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-notification-agent" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231368 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-notification-agent" Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.231382 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="proxy-httpd" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231389 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="proxy-httpd" Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.231403 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="sg-core" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231409 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="sg-core" Oct 07 19:21:20 crc kubenswrapper[4825]: E1007 19:21:20.231439 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-central-agent" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231445 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-central-agent" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231612 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="proxy-httpd" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231667 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-central-agent" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231676 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="sg-core" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.231688 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" containerName="ceilometer-notification-agent" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.246770 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.247862 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.252769 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.252927 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.253191 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.307676 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.307737 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.307763 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-config-data\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.307827 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.307846 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-run-httpd\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.307907 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-log-httpd\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.307958 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4nl\" (UniqueName: \"kubernetes.io/projected/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-kube-api-access-zs4nl\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.308064 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-scripts\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409645 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-log-httpd\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409718 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zs4nl\" (UniqueName: \"kubernetes.io/projected/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-kube-api-access-zs4nl\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409800 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-scripts\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409835 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409859 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409875 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-config-data\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409894 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.409911 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-run-httpd\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.410310 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-run-httpd\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.410494 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-log-httpd\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.415043 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.415551 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-scripts\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.415772 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.416959 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.417069 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-config-data\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.425530 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4nl\" (UniqueName: \"kubernetes.io/projected/5553aae5-6efa-4d21-bbb7-f2c0f23071b3-kube-api-access-zs4nl\") pod \"ceilometer-0\" (UID: \"5553aae5-6efa-4d21-bbb7-f2c0f23071b3\") " pod="openstack/ceilometer-0" Oct 07 19:21:20 crc kubenswrapper[4825]: I1007 19:21:20.564162 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 19:21:21 crc kubenswrapper[4825]: I1007 19:21:21.110029 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 19:21:21 crc kubenswrapper[4825]: W1007 19:21:21.110785 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5553aae5_6efa_4d21_bbb7_f2c0f23071b3.slice/crio-193246f5d643f4af739053d2dbf148f1b05e6fabcfda7b944b10efb7f2c3a5b7 WatchSource:0}: Error finding container 193246f5d643f4af739053d2dbf148f1b05e6fabcfda7b944b10efb7f2c3a5b7: Status 404 returned error can't find the container with id 193246f5d643f4af739053d2dbf148f1b05e6fabcfda7b944b10efb7f2c3a5b7 Oct 07 19:21:21 crc kubenswrapper[4825]: I1007 19:21:21.820366 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e3f5992-6777-4a65-b0f2-2510b363b9bf" path="/var/lib/kubelet/pods/1e3f5992-6777-4a65-b0f2-2510b363b9bf/volumes" Oct 07 19:21:21 crc kubenswrapper[4825]: I1007 19:21:21.877994 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5553aae5-6efa-4d21-bbb7-f2c0f23071b3","Type":"ContainerStarted","Data":"193246f5d643f4af739053d2dbf148f1b05e6fabcfda7b944b10efb7f2c3a5b7"} Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.445451 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.589544 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-config-data\") pod \"5db0052e-c084-496c-bd2a-798eedf8459c\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.589681 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-combined-ca-bundle\") pod \"5db0052e-c084-496c-bd2a-798eedf8459c\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.589764 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kng26\" (UniqueName: \"kubernetes.io/projected/5db0052e-c084-496c-bd2a-798eedf8459c-kube-api-access-kng26\") pod \"5db0052e-c084-496c-bd2a-798eedf8459c\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.589827 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5db0052e-c084-496c-bd2a-798eedf8459c-logs\") pod \"5db0052e-c084-496c-bd2a-798eedf8459c\" (UID: \"5db0052e-c084-496c-bd2a-798eedf8459c\") " Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.591197 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db0052e-c084-496c-bd2a-798eedf8459c-logs" (OuterVolumeSpecName: "logs") pod "5db0052e-c084-496c-bd2a-798eedf8459c" (UID: "5db0052e-c084-496c-bd2a-798eedf8459c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.599947 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db0052e-c084-496c-bd2a-798eedf8459c-kube-api-access-kng26" (OuterVolumeSpecName: "kube-api-access-kng26") pod "5db0052e-c084-496c-bd2a-798eedf8459c" (UID: "5db0052e-c084-496c-bd2a-798eedf8459c"). InnerVolumeSpecName "kube-api-access-kng26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.625563 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-config-data" (OuterVolumeSpecName: "config-data") pod "5db0052e-c084-496c-bd2a-798eedf8459c" (UID: "5db0052e-c084-496c-bd2a-798eedf8459c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.632511 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5db0052e-c084-496c-bd2a-798eedf8459c" (UID: "5db0052e-c084-496c-bd2a-798eedf8459c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.692292 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kng26\" (UniqueName: \"kubernetes.io/projected/5db0052e-c084-496c-bd2a-798eedf8459c-kube-api-access-kng26\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.692320 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5db0052e-c084-496c-bd2a-798eedf8459c-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.692331 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.692340 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db0052e-c084-496c-bd2a-798eedf8459c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.887457 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5553aae5-6efa-4d21-bbb7-f2c0f23071b3","Type":"ContainerStarted","Data":"c6dbe082ae9683ee34beba02dedb5019949bf3915cf8af80b6d0b27ff5133263"} Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.889771 4825 generic.go:334] "Generic (PLEG): container finished" podID="5db0052e-c084-496c-bd2a-798eedf8459c" containerID="77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677" exitCode=0 Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.889811 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5db0052e-c084-496c-bd2a-798eedf8459c","Type":"ContainerDied","Data":"77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677"} Oct 07 19:21:22 crc 
kubenswrapper[4825]: I1007 19:21:22.889832 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5db0052e-c084-496c-bd2a-798eedf8459c","Type":"ContainerDied","Data":"64d5dda7f6094cb3d1a4efd7af6b14fc5916d86ec89447371e89a3a60fa0d476"} Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.889850 4825 scope.go:117] "RemoveContainer" containerID="77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.889960 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.929353 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.934288 4825 scope.go:117] "RemoveContainer" containerID="8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.937736 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.967739 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:22 crc kubenswrapper[4825]: E1007 19:21:22.968310 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-api" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.968340 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-api" Oct 07 19:21:22 crc kubenswrapper[4825]: E1007 19:21:22.968362 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-log" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.968372 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" 
containerName="nova-api-log" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.968624 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-api" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.968661 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" containerName="nova-api-log" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.969964 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.976694 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.976751 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.976969 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.987336 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.987516 4825 scope.go:117] "RemoveContainer" containerID="77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677" Oct 07 19:21:22 crc kubenswrapper[4825]: E1007 19:21:22.988734 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677\": container with ID starting with 77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677 not found: ID does not exist" containerID="77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.989117 4825 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677"} err="failed to get container status \"77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677\": rpc error: code = NotFound desc = could not find container \"77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677\": container with ID starting with 77e022dd4cc7e8fce1559017af5b5395ee2904e4588210338e15545577ffd677 not found: ID does not exist" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.989167 4825 scope.go:117] "RemoveContainer" containerID="8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174" Oct 07 19:21:22 crc kubenswrapper[4825]: E1007 19:21:22.989604 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174\": container with ID starting with 8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174 not found: ID does not exist" containerID="8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174" Oct 07 19:21:22 crc kubenswrapper[4825]: I1007 19:21:22.989635 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174"} err="failed to get container status \"8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174\": rpc error: code = NotFound desc = could not find container \"8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174\": container with ID starting with 8a629d79648c71fd5753a6e42d4edd8027cb76ae44823bca4d34a4c9bd3a7174 not found: ID does not exist" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.101081 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-config-data\") pod 
\"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.101149 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.101190 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.101324 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382425a8-c06e-4900-9262-6dec135984da-logs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.101340 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-public-tls-certs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.101354 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2tpf\" (UniqueName: \"kubernetes.io/projected/382425a8-c06e-4900-9262-6dec135984da-kube-api-access-m2tpf\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: 
I1007 19:21:23.203462 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382425a8-c06e-4900-9262-6dec135984da-logs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.203708 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-public-tls-certs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.203779 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2tpf\" (UniqueName: \"kubernetes.io/projected/382425a8-c06e-4900-9262-6dec135984da-kube-api-access-m2tpf\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.203870 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-config-data\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.203961 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.204034 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.204934 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382425a8-c06e-4900-9262-6dec135984da-logs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.209283 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-public-tls-certs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.209349 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.212293 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-config-data\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.213344 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.238107 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2tpf\" (UniqueName: 
\"kubernetes.io/projected/382425a8-c06e-4900-9262-6dec135984da-kube-api-access-m2tpf\") pod \"nova-api-0\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.314080 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.764686 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.806462 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db0052e-c084-496c-bd2a-798eedf8459c" path="/var/lib/kubelet/pods/5db0052e-c084-496c-bd2a-798eedf8459c/volumes" Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.904023 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382425a8-c06e-4900-9262-6dec135984da","Type":"ContainerStarted","Data":"bbf0a5d755c608cb356c1965b4da6971c8c33562441ad18524e1859ac37632fc"} Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.908979 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5553aae5-6efa-4d21-bbb7-f2c0f23071b3","Type":"ContainerStarted","Data":"4edb19d0966fb4254858c9a8c9d71035de1fdf26812a383422ab46777e52eb6d"} Oct 07 19:21:23 crc kubenswrapper[4825]: I1007 19:21:23.909021 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5553aae5-6efa-4d21-bbb7-f2c0f23071b3","Type":"ContainerStarted","Data":"6412463440bc7a16a05fb6c76373163ab748379d89135872b91826a74e9680e2"} Oct 07 19:21:24 crc kubenswrapper[4825]: I1007 19:21:24.178274 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:24 crc kubenswrapper[4825]: I1007 19:21:24.198582 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" 
Oct 07 19:21:24 crc kubenswrapper[4825]: I1007 19:21:24.920986 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382425a8-c06e-4900-9262-6dec135984da","Type":"ContainerStarted","Data":"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9"} Oct 07 19:21:24 crc kubenswrapper[4825]: I1007 19:21:24.921050 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382425a8-c06e-4900-9262-6dec135984da","Type":"ContainerStarted","Data":"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d"} Oct 07 19:21:24 crc kubenswrapper[4825]: I1007 19:21:24.940113 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 19:21:24 crc kubenswrapper[4825]: I1007 19:21:24.959375 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.959349535 podStartE2EDuration="2.959349535s" podCreationTimestamp="2025-10-07 19:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:21:24.946785212 +0000 UTC m=+1273.768823849" watchObservedRunningTime="2025-10-07 19:21:24.959349535 +0000 UTC m=+1273.781388182" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.088760 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jr6f5"] Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.090072 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.095397 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.095728 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.105556 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jr6f5"] Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.239163 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-config-data\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.239479 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.239517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jhwh\" (UniqueName: \"kubernetes.io/projected/d6a27b82-0f43-4b90-8c59-067a68179b64-kube-api-access-2jhwh\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.239554 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-scripts\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.340880 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-config-data\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.341016 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.341059 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jhwh\" (UniqueName: \"kubernetes.io/projected/d6a27b82-0f43-4b90-8c59-067a68179b64-kube-api-access-2jhwh\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.341101 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-scripts\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.346123 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-scripts\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.347403 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-config-data\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.348711 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.362636 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jhwh\" (UniqueName: \"kubernetes.io/projected/d6a27b82-0f43-4b90-8c59-067a68179b64-kube-api-access-2jhwh\") pod \"nova-cell1-cell-mapping-jr6f5\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.411437 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.850097 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jr6f5"] Oct 07 19:21:25 crc kubenswrapper[4825]: W1007 19:21:25.859001 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6a27b82_0f43_4b90_8c59_067a68179b64.slice/crio-ec7eaf37fb490fd9cf75ee400c0010630af2ff8d3c412d0b8d090a5a73ccdd2c WatchSource:0}: Error finding container ec7eaf37fb490fd9cf75ee400c0010630af2ff8d3c412d0b8d090a5a73ccdd2c: Status 404 returned error can't find the container with id ec7eaf37fb490fd9cf75ee400c0010630af2ff8d3c412d0b8d090a5a73ccdd2c Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.943099 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5553aae5-6efa-4d21-bbb7-f2c0f23071b3","Type":"ContainerStarted","Data":"d2e0fb04e1e1349261d969c83a1b9f82298fcf01d1fdd44745b2cc71e7e23335"} Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.943551 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.949322 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jr6f5" event={"ID":"d6a27b82-0f43-4b90-8c59-067a68179b64","Type":"ContainerStarted","Data":"ec7eaf37fb490fd9cf75ee400c0010630af2ff8d3c412d0b8d090a5a73ccdd2c"} Oct 07 19:21:25 crc kubenswrapper[4825]: I1007 19:21:25.964013 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.866611727 podStartE2EDuration="5.963997639s" podCreationTimestamp="2025-10-07 19:21:20 +0000 UTC" firstStartedPulling="2025-10-07 19:21:21.113372245 +0000 UTC m=+1269.935410882" lastFinishedPulling="2025-10-07 19:21:25.210758157 +0000 UTC m=+1274.032796794" 
observedRunningTime="2025-10-07 19:21:25.961007015 +0000 UTC m=+1274.783045652" watchObservedRunningTime="2025-10-07 19:21:25.963997639 +0000 UTC m=+1274.786036276" Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.398385 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.479110 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z2rbb"] Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.479342 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" podUID="4c75272c-4299-451d-8fcf-82204dc97b14" containerName="dnsmasq-dns" containerID="cri-o://42711cfc8a81e917fed2e13fd9024ac1f313c82d8d697309eab32f85869245da" gracePeriod=10 Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.960043 4825 generic.go:334] "Generic (PLEG): container finished" podID="4c75272c-4299-451d-8fcf-82204dc97b14" containerID="42711cfc8a81e917fed2e13fd9024ac1f313c82d8d697309eab32f85869245da" exitCode=0 Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.960218 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" event={"ID":"4c75272c-4299-451d-8fcf-82204dc97b14","Type":"ContainerDied","Data":"42711cfc8a81e917fed2e13fd9024ac1f313c82d8d697309eab32f85869245da"} Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.960633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" event={"ID":"4c75272c-4299-451d-8fcf-82204dc97b14","Type":"ContainerDied","Data":"323a2ce2aad512d87cfec08db506ce66b25f5384086d09dbda13288f91bba973"} Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.960650 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="323a2ce2aad512d87cfec08db506ce66b25f5384086d09dbda13288f91bba973" Oct 07 19:21:26 crc 
kubenswrapper[4825]: I1007 19:21:26.962691 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jr6f5" event={"ID":"d6a27b82-0f43-4b90-8c59-067a68179b64","Type":"ContainerStarted","Data":"0600783efcd1976b63b8cddcd172840304f1b75d0058e64c302282e50be96063"} Oct 07 19:21:26 crc kubenswrapper[4825]: I1007 19:21:26.990512 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jr6f5" podStartSLOduration=1.990491106 podStartE2EDuration="1.990491106s" podCreationTimestamp="2025-10-07 19:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:21:26.976754816 +0000 UTC m=+1275.798793453" watchObservedRunningTime="2025-10-07 19:21:26.990491106 +0000 UTC m=+1275.812529743" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.000089 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.078581 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-nb\") pod \"4c75272c-4299-451d-8fcf-82204dc97b14\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.078690 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzshn\" (UniqueName: \"kubernetes.io/projected/4c75272c-4299-451d-8fcf-82204dc97b14-kube-api-access-tzshn\") pod \"4c75272c-4299-451d-8fcf-82204dc97b14\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.078755 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-swift-storage-0\") pod \"4c75272c-4299-451d-8fcf-82204dc97b14\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.078858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-svc\") pod \"4c75272c-4299-451d-8fcf-82204dc97b14\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.078899 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-sb\") pod \"4c75272c-4299-451d-8fcf-82204dc97b14\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.078991 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-config\") pod \"4c75272c-4299-451d-8fcf-82204dc97b14\" (UID: \"4c75272c-4299-451d-8fcf-82204dc97b14\") " Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.091508 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c75272c-4299-451d-8fcf-82204dc97b14-kube-api-access-tzshn" (OuterVolumeSpecName: "kube-api-access-tzshn") pod "4c75272c-4299-451d-8fcf-82204dc97b14" (UID: "4c75272c-4299-451d-8fcf-82204dc97b14"). InnerVolumeSpecName "kube-api-access-tzshn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.133984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4c75272c-4299-451d-8fcf-82204dc97b14" (UID: "4c75272c-4299-451d-8fcf-82204dc97b14"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.143167 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c75272c-4299-451d-8fcf-82204dc97b14" (UID: "4c75272c-4299-451d-8fcf-82204dc97b14"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.152650 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c75272c-4299-451d-8fcf-82204dc97b14" (UID: "4c75272c-4299-451d-8fcf-82204dc97b14"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.153929 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-config" (OuterVolumeSpecName: "config") pod "4c75272c-4299-451d-8fcf-82204dc97b14" (UID: "4c75272c-4299-451d-8fcf-82204dc97b14"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.160819 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c75272c-4299-451d-8fcf-82204dc97b14" (UID: "4c75272c-4299-451d-8fcf-82204dc97b14"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.183866 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.183902 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.183919 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzshn\" (UniqueName: \"kubernetes.io/projected/4c75272c-4299-451d-8fcf-82204dc97b14-kube-api-access-tzshn\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.183932 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.183945 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.183957 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4c75272c-4299-451d-8fcf-82204dc97b14-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:27 crc kubenswrapper[4825]: I1007 19:21:27.974773 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-z2rbb" Oct 07 19:21:28 crc kubenswrapper[4825]: I1007 19:21:28.004489 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z2rbb"] Oct 07 19:21:28 crc kubenswrapper[4825]: I1007 19:21:28.012740 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-z2rbb"] Oct 07 19:21:29 crc kubenswrapper[4825]: I1007 19:21:29.810911 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c75272c-4299-451d-8fcf-82204dc97b14" path="/var/lib/kubelet/pods/4c75272c-4299-451d-8fcf-82204dc97b14/volumes" Oct 07 19:21:31 crc kubenswrapper[4825]: I1007 19:21:31.013127 4825 generic.go:334] "Generic (PLEG): container finished" podID="d6a27b82-0f43-4b90-8c59-067a68179b64" containerID="0600783efcd1976b63b8cddcd172840304f1b75d0058e64c302282e50be96063" exitCode=0 Oct 07 19:21:31 crc kubenswrapper[4825]: I1007 19:21:31.013204 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jr6f5" event={"ID":"d6a27b82-0f43-4b90-8c59-067a68179b64","Type":"ContainerDied","Data":"0600783efcd1976b63b8cddcd172840304f1b75d0058e64c302282e50be96063"} Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.478451 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.608779 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-scripts\") pod \"d6a27b82-0f43-4b90-8c59-067a68179b64\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.609125 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jhwh\" (UniqueName: \"kubernetes.io/projected/d6a27b82-0f43-4b90-8c59-067a68179b64-kube-api-access-2jhwh\") pod \"d6a27b82-0f43-4b90-8c59-067a68179b64\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.609270 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-combined-ca-bundle\") pod \"d6a27b82-0f43-4b90-8c59-067a68179b64\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.609348 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-config-data\") pod \"d6a27b82-0f43-4b90-8c59-067a68179b64\" (UID: \"d6a27b82-0f43-4b90-8c59-067a68179b64\") " Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.614378 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a27b82-0f43-4b90-8c59-067a68179b64-kube-api-access-2jhwh" (OuterVolumeSpecName: "kube-api-access-2jhwh") pod "d6a27b82-0f43-4b90-8c59-067a68179b64" (UID: "d6a27b82-0f43-4b90-8c59-067a68179b64"). InnerVolumeSpecName "kube-api-access-2jhwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.614837 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-scripts" (OuterVolumeSpecName: "scripts") pod "d6a27b82-0f43-4b90-8c59-067a68179b64" (UID: "d6a27b82-0f43-4b90-8c59-067a68179b64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.637973 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-config-data" (OuterVolumeSpecName: "config-data") pod "d6a27b82-0f43-4b90-8c59-067a68179b64" (UID: "d6a27b82-0f43-4b90-8c59-067a68179b64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.642151 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6a27b82-0f43-4b90-8c59-067a68179b64" (UID: "d6a27b82-0f43-4b90-8c59-067a68179b64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.712302 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.712344 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.712357 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6a27b82-0f43-4b90-8c59-067a68179b64-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:32 crc kubenswrapper[4825]: I1007 19:21:32.712371 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jhwh\" (UniqueName: \"kubernetes.io/projected/d6a27b82-0f43-4b90-8c59-067a68179b64-kube-api-access-2jhwh\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.048966 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jr6f5" event={"ID":"d6a27b82-0f43-4b90-8c59-067a68179b64","Type":"ContainerDied","Data":"ec7eaf37fb490fd9cf75ee400c0010630af2ff8d3c412d0b8d090a5a73ccdd2c"} Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.049027 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7eaf37fb490fd9cf75ee400c0010630af2ff8d3c412d0b8d090a5a73ccdd2c" Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.049058 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jr6f5" Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.232472 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.233042 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-log" containerID="cri-o://6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d" gracePeriod=30 Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.233627 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-api" containerID="cri-o://2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9" gracePeriod=30 Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.248998 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.249273 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="624e8227-71a0-4baa-883d-5f5f9cab15af" containerName="nova-scheduler-scheduler" containerID="cri-o://d470b8db9b6d817181111638be9f93c24e7ef0d4077b24d541a9ec238db6873f" gracePeriod=30 Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.289914 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.290249 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-metadata" containerID="cri-o://90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1" gracePeriod=30 Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.291567 4825 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-log" containerID="cri-o://d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc" gracePeriod=30 Oct 07 19:21:33 crc kubenswrapper[4825]: I1007 19:21:33.957671 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.028202 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod624e8227_71a0_4baa_883d_5f5f9cab15af.slice/crio-d470b8db9b6d817181111638be9f93c24e7ef0d4077b24d541a9ec238db6873f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod624e8227_71a0_4baa_883d_5f5f9cab15af.slice/crio-conmon-d470b8db9b6d817181111638be9f93c24e7ef0d4077b24d541a9ec238db6873f.scope\": RecentStats: unable to find data in memory cache]" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.053597 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382425a8-c06e-4900-9262-6dec135984da-logs\") pod \"382425a8-c06e-4900-9262-6dec135984da\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.053711 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-internal-tls-certs\") pod \"382425a8-c06e-4900-9262-6dec135984da\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.053758 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-combined-ca-bundle\") pod \"382425a8-c06e-4900-9262-6dec135984da\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.053776 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2tpf\" (UniqueName: \"kubernetes.io/projected/382425a8-c06e-4900-9262-6dec135984da-kube-api-access-m2tpf\") pod \"382425a8-c06e-4900-9262-6dec135984da\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.053834 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-config-data\") pod \"382425a8-c06e-4900-9262-6dec135984da\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.053860 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-public-tls-certs\") pod \"382425a8-c06e-4900-9262-6dec135984da\" (UID: \"382425a8-c06e-4900-9262-6dec135984da\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.054365 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382425a8-c06e-4900-9262-6dec135984da-logs" (OuterVolumeSpecName: "logs") pod "382425a8-c06e-4900-9262-6dec135984da" (UID: "382425a8-c06e-4900-9262-6dec135984da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.060351 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382425a8-c06e-4900-9262-6dec135984da-kube-api-access-m2tpf" (OuterVolumeSpecName: "kube-api-access-m2tpf") pod "382425a8-c06e-4900-9262-6dec135984da" (UID: "382425a8-c06e-4900-9262-6dec135984da"). InnerVolumeSpecName "kube-api-access-m2tpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.065682 4825 generic.go:334] "Generic (PLEG): container finished" podID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerID="d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc" exitCode=143 Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.065797 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7","Type":"ContainerDied","Data":"d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc"} Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.079740 4825 generic.go:334] "Generic (PLEG): container finished" podID="624e8227-71a0-4baa-883d-5f5f9cab15af" containerID="d470b8db9b6d817181111638be9f93c24e7ef0d4077b24d541a9ec238db6873f" exitCode=0 Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.079795 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"624e8227-71a0-4baa-883d-5f5f9cab15af","Type":"ContainerDied","Data":"d470b8db9b6d817181111638be9f93c24e7ef0d4077b24d541a9ec238db6873f"} Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.083643 4825 generic.go:334] "Generic (PLEG): container finished" podID="382425a8-c06e-4900-9262-6dec135984da" containerID="2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9" exitCode=0 Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.083682 4825 generic.go:334] "Generic (PLEG): container 
finished" podID="382425a8-c06e-4900-9262-6dec135984da" containerID="6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d" exitCode=143 Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.083709 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382425a8-c06e-4900-9262-6dec135984da","Type":"ContainerDied","Data":"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9"} Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.083741 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382425a8-c06e-4900-9262-6dec135984da","Type":"ContainerDied","Data":"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d"} Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.083753 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382425a8-c06e-4900-9262-6dec135984da","Type":"ContainerDied","Data":"bbf0a5d755c608cb356c1965b4da6971c8c33562441ad18524e1859ac37632fc"} Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.083774 4825 scope.go:117] "RemoveContainer" containerID="2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.083940 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.097377 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-config-data" (OuterVolumeSpecName: "config-data") pod "382425a8-c06e-4900-9262-6dec135984da" (UID: "382425a8-c06e-4900-9262-6dec135984da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.101175 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "382425a8-c06e-4900-9262-6dec135984da" (UID: "382425a8-c06e-4900-9262-6dec135984da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.140672 4825 scope.go:117] "RemoveContainer" containerID="6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.145589 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "382425a8-c06e-4900-9262-6dec135984da" (UID: "382425a8-c06e-4900-9262-6dec135984da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.151128 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "382425a8-c06e-4900-9262-6dec135984da" (UID: "382425a8-c06e-4900-9262-6dec135984da"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.156699 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.156935 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.157023 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2tpf\" (UniqueName: \"kubernetes.io/projected/382425a8-c06e-4900-9262-6dec135984da-kube-api-access-m2tpf\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.157138 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.157248 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382425a8-c06e-4900-9262-6dec135984da-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.157344 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382425a8-c06e-4900-9262-6dec135984da-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.160421 4825 scope.go:117] "RemoveContainer" containerID="2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.161380 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9\": container with ID starting with 2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9 not found: ID does not exist" containerID="2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.161565 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9"} err="failed to get container status \"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9\": rpc error: code = NotFound desc = could not find container \"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9\": container with ID starting with 2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9 not found: ID does not exist" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.161711 4825 scope.go:117] "RemoveContainer" containerID="6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.162071 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d\": container with ID starting with 6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d not found: ID does not exist" containerID="6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.162758 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d"} err="failed to get container status \"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d\": rpc error: code = NotFound desc = could not find container 
\"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d\": container with ID starting with 6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d not found: ID does not exist" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.162886 4825 scope.go:117] "RemoveContainer" containerID="2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.163536 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9"} err="failed to get container status \"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9\": rpc error: code = NotFound desc = could not find container \"2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9\": container with ID starting with 2e709d2673e8e0a0fb8eb8805e73717d5ba39b5a0d2a3c82c82ea3c08379a0c9 not found: ID does not exist" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.163716 4825 scope.go:117] "RemoveContainer" containerID="6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.164064 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d"} err="failed to get container status \"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d\": rpc error: code = NotFound desc = could not find container \"6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d\": container with ID starting with 6be542c4f65932cc0f310e7ff8d30ee60101b3b291fde28775a119244742fc7d not found: ID does not exist" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.185443 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.258164 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nczqp\" (UniqueName: \"kubernetes.io/projected/624e8227-71a0-4baa-883d-5f5f9cab15af-kube-api-access-nczqp\") pod \"624e8227-71a0-4baa-883d-5f5f9cab15af\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.258647 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-config-data\") pod \"624e8227-71a0-4baa-883d-5f5f9cab15af\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.258883 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-combined-ca-bundle\") pod \"624e8227-71a0-4baa-883d-5f5f9cab15af\" (UID: \"624e8227-71a0-4baa-883d-5f5f9cab15af\") " Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.261516 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624e8227-71a0-4baa-883d-5f5f9cab15af-kube-api-access-nczqp" (OuterVolumeSpecName: "kube-api-access-nczqp") pod "624e8227-71a0-4baa-883d-5f5f9cab15af" (UID: "624e8227-71a0-4baa-883d-5f5f9cab15af"). InnerVolumeSpecName "kube-api-access-nczqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.281920 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "624e8227-71a0-4baa-883d-5f5f9cab15af" (UID: "624e8227-71a0-4baa-883d-5f5f9cab15af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.286706 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-config-data" (OuterVolumeSpecName: "config-data") pod "624e8227-71a0-4baa-883d-5f5f9cab15af" (UID: "624e8227-71a0-4baa-883d-5f5f9cab15af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.361485 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.361530 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624e8227-71a0-4baa-883d-5f5f9cab15af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.361545 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nczqp\" (UniqueName: \"kubernetes.io/projected/624e8227-71a0-4baa-883d-5f5f9cab15af-kube-api-access-nczqp\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.493745 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.501626 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522028 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.522585 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-api" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522611 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-api" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.522641 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624e8227-71a0-4baa-883d-5f5f9cab15af" containerName="nova-scheduler-scheduler" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522649 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="624e8227-71a0-4baa-883d-5f5f9cab15af" containerName="nova-scheduler-scheduler" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.522658 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c75272c-4299-451d-8fcf-82204dc97b14" containerName="dnsmasq-dns" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522665 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c75272c-4299-451d-8fcf-82204dc97b14" containerName="dnsmasq-dns" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.522687 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-log" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522694 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-log" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.522710 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c75272c-4299-451d-8fcf-82204dc97b14" containerName="init" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522720 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c75272c-4299-451d-8fcf-82204dc97b14" containerName="init" Oct 07 19:21:34 crc kubenswrapper[4825]: E1007 19:21:34.522727 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a27b82-0f43-4b90-8c59-067a68179b64" containerName="nova-manage" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522735 4825 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d6a27b82-0f43-4b90-8c59-067a68179b64" containerName="nova-manage" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522940 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-api" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522964 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="624e8227-71a0-4baa-883d-5f5f9cab15af" containerName="nova-scheduler-scheduler" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522975 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a27b82-0f43-4b90-8c59-067a68179b64" containerName="nova-manage" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.522994 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c75272c-4299-451d-8fcf-82204dc97b14" containerName="dnsmasq-dns" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.523005 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="382425a8-c06e-4900-9262-6dec135984da" containerName="nova-api-log" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.524463 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.528854 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.529094 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.530459 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.531494 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.667514 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.667635 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-config-data\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.667713 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-logs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.667869 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.667947 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8nrl\" (UniqueName: \"kubernetes.io/projected/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-kube-api-access-n8nrl\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.668060 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.769720 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.769804 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.769843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-config-data\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 
19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.769858 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-logs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.769905 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.769929 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8nrl\" (UniqueName: \"kubernetes.io/projected/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-kube-api-access-n8nrl\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.771211 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-logs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.773968 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.774845 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-internal-tls-certs\") pod \"nova-api-0\" 
(UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.775533 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.777966 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-config-data\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.787082 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8nrl\" (UniqueName: \"kubernetes.io/projected/6a7f4fc7-89f3-4e32-94fe-f4117c1ca522-kube-api-access-n8nrl\") pod \"nova-api-0\" (UID: \"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522\") " pod="openstack/nova-api-0" Oct 07 19:21:34 crc kubenswrapper[4825]: I1007 19:21:34.839649 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.098860 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"624e8227-71a0-4baa-883d-5f5f9cab15af","Type":"ContainerDied","Data":"d5b71a5e05a967f8842cd6b63720f2202ea605f55369653512513d1026412889"} Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.098905 4825 scope.go:117] "RemoveContainer" containerID="d470b8db9b6d817181111638be9f93c24e7ef0d4077b24d541a9ec238db6873f" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.098930 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.140632 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.157584 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.167932 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.169613 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.171600 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.185664 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.280571 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146610c1-1e58-4a52-ba58-b190f53f4a03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.280670 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wcld\" (UniqueName: \"kubernetes.io/projected/146610c1-1e58-4a52-ba58-b190f53f4a03-kube-api-access-5wcld\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.280762 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/146610c1-1e58-4a52-ba58-b190f53f4a03-config-data\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.303515 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.383387 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146610c1-1e58-4a52-ba58-b190f53f4a03-config-data\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.383663 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146610c1-1e58-4a52-ba58-b190f53f4a03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.383759 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wcld\" (UniqueName: \"kubernetes.io/projected/146610c1-1e58-4a52-ba58-b190f53f4a03-kube-api-access-5wcld\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.389896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146610c1-1e58-4a52-ba58-b190f53f4a03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.390040 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/146610c1-1e58-4a52-ba58-b190f53f4a03-config-data\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.413160 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wcld\" (UniqueName: \"kubernetes.io/projected/146610c1-1e58-4a52-ba58-b190f53f4a03-kube-api-access-5wcld\") pod \"nova-scheduler-0\" (UID: \"146610c1-1e58-4a52-ba58-b190f53f4a03\") " pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.498941 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.811646 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382425a8-c06e-4900-9262-6dec135984da" path="/var/lib/kubelet/pods/382425a8-c06e-4900-9262-6dec135984da/volumes" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.812979 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624e8227-71a0-4baa-883d-5f5f9cab15af" path="/var/lib/kubelet/pods/624e8227-71a0-4baa-883d-5f5f9cab15af/volumes" Oct 07 19:21:35 crc kubenswrapper[4825]: I1007 19:21:35.975388 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 19:21:35 crc kubenswrapper[4825]: W1007 19:21:35.976479 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod146610c1_1e58_4a52_ba58_b190f53f4a03.slice/crio-4c0671d4033a979a5cc4bcb739cabd0be6cb40c6673496df4a53680909e807ae WatchSource:0}: Error finding container 4c0671d4033a979a5cc4bcb739cabd0be6cb40c6673496df4a53680909e807ae: Status 404 returned error can't find the container with id 4c0671d4033a979a5cc4bcb739cabd0be6cb40c6673496df4a53680909e807ae Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.119072 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522","Type":"ContainerStarted","Data":"a20f669d741d119d6d1a3a946f28eaf837e44d9ac524b3253ae82ac4ad682e4f"} Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.119141 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522","Type":"ContainerStarted","Data":"cf48e9ad7464126bb634cb6a9e4dcf1557810805203fd974cdde3bc050ee2b67"} Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.119165 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a7f4fc7-89f3-4e32-94fe-f4117c1ca522","Type":"ContainerStarted","Data":"df0f4023f06e8ae4845b22b8b47382bdaae67773f736eab47b49b75166c5f692"} Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.121840 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"146610c1-1e58-4a52-ba58-b190f53f4a03","Type":"ContainerStarted","Data":"4c0671d4033a979a5cc4bcb739cabd0be6cb40c6673496df4a53680909e807ae"} Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.147733 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.147707688 podStartE2EDuration="2.147707688s" podCreationTimestamp="2025-10-07 19:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:21:36.142314779 +0000 UTC m=+1284.964353416" watchObservedRunningTime="2025-10-07 19:21:36.147707688 +0000 UTC m=+1284.969746365" Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.440359 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 
10.217.0.2:42746->10.217.0.196:8775: read: connection reset by peer" Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.440483 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:42762->10.217.0.196:8775: read: connection reset by peer" Oct 07 19:21:36 crc kubenswrapper[4825]: I1007 19:21:36.916624 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.025688 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-config-data\") pod \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.026134 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md5n5\" (UniqueName: \"kubernetes.io/projected/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-kube-api-access-md5n5\") pod \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.026212 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-combined-ca-bundle\") pod \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.026266 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-logs\") pod \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\" (UID: 
\"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.026402 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-nova-metadata-tls-certs\") pod \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\" (UID: \"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7\") " Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.027790 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-logs" (OuterVolumeSpecName: "logs") pod "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" (UID: "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.035523 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-kube-api-access-md5n5" (OuterVolumeSpecName: "kube-api-access-md5n5") pod "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" (UID: "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7"). InnerVolumeSpecName "kube-api-access-md5n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.063628 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-config-data" (OuterVolumeSpecName: "config-data") pod "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" (UID: "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.067415 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" (UID: "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.102856 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" (UID: "2881a5f6-3d20-4bbb-a82e-9aafb0076ca7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.128013 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.128042 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.128051 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md5n5\" (UniqueName: \"kubernetes.io/projected/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-kube-api-access-md5n5\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.128060 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.128069 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7-logs\") on node \"crc\" DevicePath \"\"" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.150185 4825 generic.go:334] "Generic (PLEG): container finished" podID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerID="90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1" exitCode=0 Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.150276 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.150285 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7","Type":"ContainerDied","Data":"90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1"} Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.150370 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2881a5f6-3d20-4bbb-a82e-9aafb0076ca7","Type":"ContainerDied","Data":"3a8046dd381ce5afe7984f2e27565726032c55ce6b0b8a1b0983051a17cc4039"} Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.150392 4825 scope.go:117] "RemoveContainer" containerID="90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.153695 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"146610c1-1e58-4a52-ba58-b190f53f4a03","Type":"ContainerStarted","Data":"ed764aec58f89eb6c03f213f99ff934ba63e67b4b985b3a45001555c7216c2f1"} Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.168597 4825 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.16858156 podStartE2EDuration="2.16858156s" podCreationTimestamp="2025-10-07 19:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:21:37.166462473 +0000 UTC m=+1285.988501100" watchObservedRunningTime="2025-10-07 19:21:37.16858156 +0000 UTC m=+1285.990620197" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.191912 4825 scope.go:117] "RemoveContainer" containerID="d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.197441 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.212728 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.214349 4825 scope.go:117] "RemoveContainer" containerID="90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1" Oct 07 19:21:37 crc kubenswrapper[4825]: E1007 19:21:37.214790 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1\": container with ID starting with 90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1 not found: ID does not exist" containerID="90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.214834 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1"} err="failed to get container status \"90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1\": rpc error: code = NotFound desc = could not find container 
\"90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1\": container with ID starting with 90a8fe270a6c97ce5f74e190b3c8952e0ff16e8d08a36161cc244e0eaac291e1 not found: ID does not exist" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.214859 4825 scope.go:117] "RemoveContainer" containerID="d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc" Oct 07 19:21:37 crc kubenswrapper[4825]: E1007 19:21:37.215127 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc\": container with ID starting with d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc not found: ID does not exist" containerID="d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.215153 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc"} err="failed to get container status \"d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc\": rpc error: code = NotFound desc = could not find container \"d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc\": container with ID starting with d815f4609e6e8f8ee9248997b1e4db698381be99aab6d1c13142990c254dddcc not found: ID does not exist" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.222352 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:21:37 crc kubenswrapper[4825]: E1007 19:21:37.222760 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-log" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.222776 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-log" Oct 07 19:21:37 
crc kubenswrapper[4825]: E1007 19:21:37.222797 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-metadata" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.222803 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-metadata" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.222991 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-metadata" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.223006 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" containerName="nova-metadata-log" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.224048 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.226338 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.226485 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.244515 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.331558 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.331683 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.332044 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-logs\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.332175 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-config-data\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.332219 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w8z\" (UniqueName: \"kubernetes.io/projected/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-kube-api-access-f4w8z\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.434146 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.434739 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-logs\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.434815 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-config-data\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.434859 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w8z\" (UniqueName: \"kubernetes.io/projected/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-kube-api-access-f4w8z\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.434915 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.435298 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-logs\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.437909 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 
crc kubenswrapper[4825]: I1007 19:21:37.438421 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-config-data\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.438450 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.457742 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w8z\" (UniqueName: \"kubernetes.io/projected/c4c4cfbe-20c8-402c-90b0-040fbbb0d58e-kube-api-access-f4w8z\") pod \"nova-metadata-0\" (UID: \"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e\") " pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.545998 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 19:21:37 crc kubenswrapper[4825]: I1007 19:21:37.809345 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2881a5f6-3d20-4bbb-a82e-9aafb0076ca7" path="/var/lib/kubelet/pods/2881a5f6-3d20-4bbb-a82e-9aafb0076ca7/volumes" Oct 07 19:21:38 crc kubenswrapper[4825]: W1007 19:21:38.009898 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4c4cfbe_20c8_402c_90b0_040fbbb0d58e.slice/crio-89908966b6dfe9d2fccbda2cef7a281d8548d481aab99d67b3fad8f560206063 WatchSource:0}: Error finding container 89908966b6dfe9d2fccbda2cef7a281d8548d481aab99d67b3fad8f560206063: Status 404 returned error can't find the container with id 89908966b6dfe9d2fccbda2cef7a281d8548d481aab99d67b3fad8f560206063 Oct 07 19:21:38 crc kubenswrapper[4825]: I1007 19:21:38.013012 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 19:21:38 crc kubenswrapper[4825]: I1007 19:21:38.163505 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e","Type":"ContainerStarted","Data":"89908966b6dfe9d2fccbda2cef7a281d8548d481aab99d67b3fad8f560206063"} Oct 07 19:21:39 crc kubenswrapper[4825]: I1007 19:21:39.173107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e","Type":"ContainerStarted","Data":"8e143e2fff2f870446570be9c471083612576deea5a0c733b1c00e6716946ed6"} Oct 07 19:21:39 crc kubenswrapper[4825]: I1007 19:21:39.173404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4c4cfbe-20c8-402c-90b0-040fbbb0d58e","Type":"ContainerStarted","Data":"7cb689f7062434917d29876e6f28bfa0d0c6b0a2c5440a6c32d713dba67bbf8b"} Oct 07 19:21:39 crc kubenswrapper[4825]: I1007 19:21:39.201026 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.20100581 podStartE2EDuration="2.20100581s" podCreationTimestamp="2025-10-07 19:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:21:39.193914588 +0000 UTC m=+1288.015953275" watchObservedRunningTime="2025-10-07 19:21:39.20100581 +0000 UTC m=+1288.023044437" Oct 07 19:21:40 crc kubenswrapper[4825]: I1007 19:21:40.500203 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 19:21:42 crc kubenswrapper[4825]: I1007 19:21:42.546797 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 19:21:42 crc kubenswrapper[4825]: I1007 19:21:42.547151 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 19:21:44 crc kubenswrapper[4825]: I1007 19:21:44.839918 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 19:21:44 crc kubenswrapper[4825]: I1007 19:21:44.840781 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 19:21:45 crc kubenswrapper[4825]: I1007 19:21:45.500141 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 19:21:45 crc kubenswrapper[4825]: I1007 19:21:45.548850 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 19:21:45 crc kubenswrapper[4825]: I1007 19:21:45.857437 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a7f4fc7-89f3-4e32-94fe-f4117c1ca522" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Oct 07 19:21:45 crc kubenswrapper[4825]: I1007 19:21:45.857462 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a7f4fc7-89f3-4e32-94fe-f4117c1ca522" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 19:21:46 crc kubenswrapper[4825]: I1007 19:21:46.302948 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 19:21:47 crc kubenswrapper[4825]: I1007 19:21:47.547846 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 19:21:47 crc kubenswrapper[4825]: I1007 19:21:47.548187 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 19:21:48 crc kubenswrapper[4825]: I1007 19:21:48.565522 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c4c4cfbe-20c8-402c-90b0-040fbbb0d58e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 19:21:48 crc kubenswrapper[4825]: I1007 19:21:48.565556 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c4c4cfbe-20c8-402c-90b0-040fbbb0d58e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 19:21:50 crc kubenswrapper[4825]: I1007 19:21:50.575996 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 19:21:54 crc kubenswrapper[4825]: I1007 19:21:54.851284 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 19:21:54 crc 
kubenswrapper[4825]: I1007 19:21:54.852380 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 19:21:54 crc kubenswrapper[4825]: I1007 19:21:54.853249 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 19:21:54 crc kubenswrapper[4825]: I1007 19:21:54.865622 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 19:21:55 crc kubenswrapper[4825]: I1007 19:21:55.351913 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 19:21:55 crc kubenswrapper[4825]: I1007 19:21:55.359943 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 19:21:57 crc kubenswrapper[4825]: I1007 19:21:57.553721 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 19:21:57 crc kubenswrapper[4825]: I1007 19:21:57.558897 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 19:21:57 crc kubenswrapper[4825]: I1007 19:21:57.561746 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 19:21:58 crc kubenswrapper[4825]: I1007 19:21:58.397342 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 19:22:06 crc kubenswrapper[4825]: I1007 19:22:06.189022 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:22:07 crc kubenswrapper[4825]: I1007 19:22:07.085961 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:22:10 crc kubenswrapper[4825]: I1007 19:22:10.589462 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" 
containerName="rabbitmq" containerID="cri-o://b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7" gracePeriod=604796 Oct 07 19:22:11 crc kubenswrapper[4825]: I1007 19:22:11.208654 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" containerName="rabbitmq" containerID="cri-o://6c136e84a59f4391d0345e16fc9393797c5051f0fd6778063acfdfa39b8ede8c" gracePeriod=604796 Oct 07 19:22:16 crc kubenswrapper[4825]: I1007 19:22:16.922831 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.216971 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.229859 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.362987 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-erlang-cookie\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363034 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-server-conf\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: 
\"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363054 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363082 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-confd\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363203 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-plugins-conf\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363320 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s4gz\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-kube-api-access-7s4gz\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363358 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-plugins\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363391 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-config-data\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363414 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19bd5f67-ab1b-4816-8e44-f792ea626299-pod-info\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363451 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19bd5f67-ab1b-4816-8e44-f792ea626299-erlang-cookie-secret\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363529 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-tls\") pod \"19bd5f67-ab1b-4816-8e44-f792ea626299\" (UID: \"19bd5f67-ab1b-4816-8e44-f792ea626299\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363650 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.363978 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.364273 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.365357 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.371432 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.380197 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-kube-api-access-7s4gz" (OuterVolumeSpecName: "kube-api-access-7s4gz") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "kube-api-access-7s4gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.382328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.393939 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19bd5f67-ab1b-4816-8e44-f792ea626299-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.394384 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/19bd5f67-ab1b-4816-8e44-f792ea626299-pod-info" (OuterVolumeSpecName: "pod-info") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.414742 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-config-data" (OuterVolumeSpecName: "config-data") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.435209 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-server-conf" (OuterVolumeSpecName: "server-conf") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465526 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465557 4825 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465584 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465595 4825 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc 
kubenswrapper[4825]: I1007 19:22:17.465605 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s4gz\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-kube-api-access-7s4gz\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465616 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465624 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19bd5f67-ab1b-4816-8e44-f792ea626299-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465632 4825 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19bd5f67-ab1b-4816-8e44-f792ea626299-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.465640 4825 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19bd5f67-ab1b-4816-8e44-f792ea626299-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.499478 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.553209 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "19bd5f67-ab1b-4816-8e44-f792ea626299" (UID: "19bd5f67-ab1b-4816-8e44-f792ea626299"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.567397 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.567436 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19bd5f67-ab1b-4816-8e44-f792ea626299-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.608618 4825 generic.go:334] "Generic (PLEG): container finished" podID="f43a8cb5-b546-476e-a429-12947216e9b0" containerID="6c136e84a59f4391d0345e16fc9393797c5051f0fd6778063acfdfa39b8ede8c" exitCode=0 Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.608739 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f43a8cb5-b546-476e-a429-12947216e9b0","Type":"ContainerDied","Data":"6c136e84a59f4391d0345e16fc9393797c5051f0fd6778063acfdfa39b8ede8c"} Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.620984 4825 generic.go:334] "Generic (PLEG): container finished" podID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerID="b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7" exitCode=0 Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.621041 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19bd5f67-ab1b-4816-8e44-f792ea626299","Type":"ContainerDied","Data":"b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7"} Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.621071 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19bd5f67-ab1b-4816-8e44-f792ea626299","Type":"ContainerDied","Data":"11d18c93ae0a82bb132f609a3f87567fb3a8a2563fb9570d15f100728dc8a27e"} Oct 07 
19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.621094 4825 scope.go:117] "RemoveContainer" containerID="b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.621363 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.640819 4825 scope.go:117] "RemoveContainer" containerID="1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.661268 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.679219 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.688949 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:22:17 crc kubenswrapper[4825]: E1007 19:22:17.689460 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerName="setup-container" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.689476 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerName="setup-container" Oct 07 19:22:17 crc kubenswrapper[4825]: E1007 19:22:17.689493 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerName="rabbitmq" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.689499 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerName="rabbitmq" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.689717 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" containerName="rabbitmq" Oct 07 19:22:17 crc 
kubenswrapper[4825]: I1007 19:22:17.691475 4825 scope.go:117] "RemoveContainer" containerID="b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.692896 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: E1007 19:22:17.694313 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7\": container with ID starting with b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7 not found: ID does not exist" containerID="b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.694337 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7"} err="failed to get container status \"b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7\": rpc error: code = NotFound desc = could not find container \"b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7\": container with ID starting with b578eafcd8ed7708e8777b91b6e61504238a46aa50b111e1dd945c5896b307c7 not found: ID does not exist" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.694357 4825 scope.go:117] "RemoveContainer" containerID="1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.696925 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jffjz" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.697068 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.697167 4825 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.697405 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.697410 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.697544 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.697620 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 19:22:17 crc kubenswrapper[4825]: E1007 19:22:17.701750 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad\": container with ID starting with 1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad not found: ID does not exist" containerID="1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.701785 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad"} err="failed to get container status \"1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad\": rpc error: code = NotFound desc = could not find container \"1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad\": container with ID starting with 1e33937e19cfcaed27b27c9c08b402c1fd53714ac47cb052f08e88906bd93bad not found: ID does not exist" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.709592 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:22:17 crc kubenswrapper[4825]: 
I1007 19:22:17.734248 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771260 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771302 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18c777f8-aad0-482a-b132-ad417d64eb6e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771334 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47m4\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-kube-api-access-l47m4\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771357 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-config-data\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771415 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18c777f8-aad0-482a-b132-ad417d64eb6e-pod-info\") pod \"rabbitmq-server-0\" 
(UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771436 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771458 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771494 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771518 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771535 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" 
Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.771555 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.808852 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19bd5f67-ab1b-4816-8e44-f792ea626299" path="/var/lib/kubelet/pods/19bd5f67-ab1b-4816-8e44-f792ea626299/volumes" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872212 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plwj5\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-kube-api-access-plwj5\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872295 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872355 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-plugins-conf\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872415 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-server-conf\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: 
\"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872448 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-plugins\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872511 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-confd\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872547 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-tls\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872591 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f43a8cb5-b546-476e-a429-12947216e9b0-erlang-cookie-secret\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872651 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f43a8cb5-b546-476e-a429-12947216e9b0-pod-info\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872721 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-erlang-cookie\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.872767 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-config-data\") pod \"f43a8cb5-b546-476e-a429-12947216e9b0\" (UID: \"f43a8cb5-b546-476e-a429-12947216e9b0\") " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873023 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873137 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873171 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 
19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873197 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873296 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873320 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18c777f8-aad0-482a-b132-ad417d64eb6e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873362 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l47m4\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-kube-api-access-l47m4\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873396 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-config-data\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873506 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/18c777f8-aad0-482a-b132-ad417d64eb6e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873529 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873545 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873575 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873640 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.873751 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.874485 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.875198 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-config-data\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.876621 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18c777f8-aad0-482a-b132-ad417d64eb6e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.876900 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.877824 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.878151 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-kube-api-access-plwj5" (OuterVolumeSpecName: "kube-api-access-plwj5") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "kube-api-access-plwj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.878159 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f43a8cb5-b546-476e-a429-12947216e9b0-pod-info" (OuterVolumeSpecName: "pod-info") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.879172 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.879306 4825 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.879351 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.880342 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18c777f8-aad0-482a-b132-ad417d64eb6e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.883684 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.896677 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.896859 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a8cb5-b546-476e-a429-12947216e9b0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.898010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47m4\" (UniqueName: \"kubernetes.io/projected/18c777f8-aad0-482a-b132-ad417d64eb6e-kube-api-access-l47m4\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.900549 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18c777f8-aad0-482a-b132-ad417d64eb6e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.922788 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-config-data" (OuterVolumeSpecName: "config-data") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: 
"f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.938937 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"18c777f8-aad0-482a-b132-ad417d64eb6e\") " pod="openstack/rabbitmq-server-0" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.964214 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-server-conf" (OuterVolumeSpecName: "server-conf") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980796 4825 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980837 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980852 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980865 4825 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f43a8cb5-b546-476e-a429-12947216e9b0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc 
kubenswrapper[4825]: I1007 19:22:17.980878 4825 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f43a8cb5-b546-476e-a429-12947216e9b0-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980892 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980903 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f43a8cb5-b546-476e-a429-12947216e9b0-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980915 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plwj5\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-kube-api-access-plwj5\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.980948 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 07 19:22:17 crc kubenswrapper[4825]: I1007 19:22:17.986889 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f43a8cb5-b546-476e-a429-12947216e9b0" (UID: "f43a8cb5-b546-476e-a429-12947216e9b0"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.010140 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.022172 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.083513 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f43a8cb5-b546-476e-a429-12947216e9b0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.083567 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.338184 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.634898 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18c777f8-aad0-482a-b132-ad417d64eb6e","Type":"ContainerStarted","Data":"b0aa422a49a1aea346745df7762de6607deaecb8c6a10edfd280672775cf2830"} Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.637331 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f43a8cb5-b546-476e-a429-12947216e9b0","Type":"ContainerDied","Data":"f5cbeb286174014d6baa852120d099182efb2b4a6bee1dd2828dc77645692273"} Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.637378 4825 scope.go:117] "RemoveContainer" containerID="6c136e84a59f4391d0345e16fc9393797c5051f0fd6778063acfdfa39b8ede8c" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.637402 4825 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.663480 4825 scope.go:117] "RemoveContainer" containerID="e7089db7df0abbff50c6bfa62c1f82416a51470cf96c4044c29f1c4a871a3adc" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.669699 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.677661 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.700674 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:22:18 crc kubenswrapper[4825]: E1007 19:22:18.701032 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" containerName="rabbitmq" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.701049 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" containerName="rabbitmq" Oct 07 19:22:18 crc kubenswrapper[4825]: E1007 19:22:18.701066 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" containerName="setup-container" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.701073 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" containerName="setup-container" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.702019 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" containerName="rabbitmq" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.702915 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.704990 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.705423 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.706637 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.706961 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vvsrv" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.707018 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.707084 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.707351 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.719682 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.797329 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.797624 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e773083b-ae36-44eb-bb82-18b12b504439-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.797729 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e773083b-ae36-44eb-bb82-18b12b504439-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.797883 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.798045 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.798250 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.798368 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.798396 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkqjr\" (UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-kube-api-access-lkqjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.798432 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.798456 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.798483 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.900862 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.900977 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901043 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901213 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901278 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkqjr\" (UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-kube-api-access-lkqjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901367 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901517 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901584 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901637 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e773083b-ae36-44eb-bb82-18b12b504439-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901668 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901703 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e773083b-ae36-44eb-bb82-18b12b504439-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.901844 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.903533 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.904279 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.904474 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.905171 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e773083b-ae36-44eb-bb82-18b12b504439-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.907105 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e773083b-ae36-44eb-bb82-18b12b504439-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.907868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.908890 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e773083b-ae36-44eb-bb82-18b12b504439-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.911788 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.924206 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkqjr\" (UniqueName: 
\"kubernetes.io/projected/e773083b-ae36-44eb-bb82-18b12b504439-kube-api-access-lkqjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:18 crc kubenswrapper[4825]: I1007 19:22:18.936691 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e773083b-ae36-44eb-bb82-18b12b504439\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.021960 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.258296 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-dt42p"] Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.262371 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.265499 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-dt42p"] Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.269170 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.308355 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.308415 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.308481 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.308509 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-svc\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " 
pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.308540 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj8cd\" (UniqueName: \"kubernetes.io/projected/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-kube-api-access-bj8cd\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.308636 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-config\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.311112 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.412980 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413030 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-svc\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " 
pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413065 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj8cd\" (UniqueName: \"kubernetes.io/projected/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-kube-api-access-bj8cd\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413119 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-config\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413142 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413172 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413194 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " 
pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413857 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.413981 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-svc\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.414020 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.414446 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.414463 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: 
I1007 19:22:19.414988 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-config\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.430206 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj8cd\" (UniqueName: \"kubernetes.io/projected/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-kube-api-access-bj8cd\") pod \"dnsmasq-dns-67b789f86c-dt42p\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.516617 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.589860 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:19 crc kubenswrapper[4825]: W1007 19:22:19.649341 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode773083b_ae36_44eb_bb82_18b12b504439.slice/crio-dc9a2a8e9a4d4b1a2165fe589d35089b5b80734c953ab721ff011fffa6fea80e WatchSource:0}: Error finding container dc9a2a8e9a4d4b1a2165fe589d35089b5b80734c953ab721ff011fffa6fea80e: Status 404 returned error can't find the container with id dc9a2a8e9a4d4b1a2165fe589d35089b5b80734c953ab721ff011fffa6fea80e Oct 07 19:22:19 crc kubenswrapper[4825]: I1007 19:22:19.816333 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43a8cb5-b546-476e-a429-12947216e9b0" path="/var/lib/kubelet/pods/f43a8cb5-b546-476e-a429-12947216e9b0/volumes" Oct 07 19:22:20 crc kubenswrapper[4825]: I1007 19:22:20.110562 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-67b789f86c-dt42p"] Oct 07 19:22:20 crc kubenswrapper[4825]: I1007 19:22:20.657293 4825 generic.go:334] "Generic (PLEG): container finished" podID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerID="2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c" exitCode=0 Oct 07 19:22:20 crc kubenswrapper[4825]: I1007 19:22:20.657397 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" event={"ID":"6716deb6-110b-43b0-a4f7-9897d1c9e3bd","Type":"ContainerDied","Data":"2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c"} Oct 07 19:22:20 crc kubenswrapper[4825]: I1007 19:22:20.657429 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" event={"ID":"6716deb6-110b-43b0-a4f7-9897d1c9e3bd","Type":"ContainerStarted","Data":"b7a8b6d22cd5750225d6d80440099f7c534e05180ac8c39544a540ca2781fc60"} Oct 07 19:22:20 crc kubenswrapper[4825]: I1007 19:22:20.660495 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18c777f8-aad0-482a-b132-ad417d64eb6e","Type":"ContainerStarted","Data":"f8ddf6df8392180177a9608cc374d7b84023de7f3cffa9ad21070d8eea70c927"} Oct 07 19:22:20 crc kubenswrapper[4825]: I1007 19:22:20.661876 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e773083b-ae36-44eb-bb82-18b12b504439","Type":"ContainerStarted","Data":"dc9a2a8e9a4d4b1a2165fe589d35089b5b80734c953ab721ff011fffa6fea80e"} Oct 07 19:22:21 crc kubenswrapper[4825]: I1007 19:22:21.670154 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e773083b-ae36-44eb-bb82-18b12b504439","Type":"ContainerStarted","Data":"9b796568611911d3c7592d7ff963cb1852c68089a96641586d73d116ab1b337b"} Oct 07 19:22:21 crc kubenswrapper[4825]: I1007 19:22:21.673294 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-67b789f86c-dt42p" event={"ID":"6716deb6-110b-43b0-a4f7-9897d1c9e3bd","Type":"ContainerStarted","Data":"08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b"} Oct 07 19:22:21 crc kubenswrapper[4825]: I1007 19:22:21.725052 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" podStartSLOduration=2.7250381409999997 podStartE2EDuration="2.725038141s" podCreationTimestamp="2025-10-07 19:22:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:22:21.718921716 +0000 UTC m=+1330.540960353" watchObservedRunningTime="2025-10-07 19:22:21.725038141 +0000 UTC m=+1330.547076778" Oct 07 19:22:22 crc kubenswrapper[4825]: I1007 19:22:22.685647 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.591515 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.700145 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l6vtv"] Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.700746 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" podUID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerName="dnsmasq-dns" containerID="cri-o://cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8" gracePeriod=10 Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.832128 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-gkpsf"] Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.834140 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.845811 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-gkpsf"] Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.936671 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwblb\" (UniqueName: \"kubernetes.io/projected/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-kube-api-access-jwblb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.936756 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.936781 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.936860 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.936929 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.937187 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-config\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:29 crc kubenswrapper[4825]: I1007 19:22:29.937284 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.039183 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.039384 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.039441 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-config\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.039480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwblb\" (UniqueName: \"kubernetes.io/projected/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-kube-api-access-jwblb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.039537 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.039561 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.039703 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.040812 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.041547 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.042141 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.042706 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-config\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.043652 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.044257 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-ovsdbserver-nb\") pod 
\"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.086653 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwblb\" (UniqueName: \"kubernetes.io/projected/4dbb4b22-9fab-40a8-8fee-1d77e4e37c80-kube-api-access-jwblb\") pod \"dnsmasq-dns-cb6ffcf87-gkpsf\" (UID: \"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.197634 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.219828 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.344580 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-config\") pod \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.344651 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-svc\") pod \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.344712 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc2n7\" (UniqueName: \"kubernetes.io/projected/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-kube-api-access-zc2n7\") pod \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 
19:22:30.344739 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-nb\") pod \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.344836 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-sb\") pod \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.344931 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-swift-storage-0\") pod \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\" (UID: \"68dafb60-bc0b-4329-9114-91a5b6d5bfe8\") " Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.349638 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-kube-api-access-zc2n7" (OuterVolumeSpecName: "kube-api-access-zc2n7") pod "68dafb60-bc0b-4329-9114-91a5b6d5bfe8" (UID: "68dafb60-bc0b-4329-9114-91a5b6d5bfe8"). InnerVolumeSpecName "kube-api-access-zc2n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.396512 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68dafb60-bc0b-4329-9114-91a5b6d5bfe8" (UID: "68dafb60-bc0b-4329-9114-91a5b6d5bfe8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.399622 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-config" (OuterVolumeSpecName: "config") pod "68dafb60-bc0b-4329-9114-91a5b6d5bfe8" (UID: "68dafb60-bc0b-4329-9114-91a5b6d5bfe8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.402911 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68dafb60-bc0b-4329-9114-91a5b6d5bfe8" (UID: "68dafb60-bc0b-4329-9114-91a5b6d5bfe8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.409296 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "68dafb60-bc0b-4329-9114-91a5b6d5bfe8" (UID: "68dafb60-bc0b-4329-9114-91a5b6d5bfe8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.410477 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68dafb60-bc0b-4329-9114-91a5b6d5bfe8" (UID: "68dafb60-bc0b-4329-9114-91a5b6d5bfe8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.448589 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.448623 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.448632 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.448641 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc2n7\" (UniqueName: \"kubernetes.io/projected/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-kube-api-access-zc2n7\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.448651 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.448659 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68dafb60-bc0b-4329-9114-91a5b6d5bfe8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.660099 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-gkpsf"] Oct 07 19:22:30 crc kubenswrapper[4825]: W1007 19:22:30.660688 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dbb4b22_9fab_40a8_8fee_1d77e4e37c80.slice/crio-519d041027127a7c610e23fe58ff971bf880eec0b3699a6818ca7a2581c0362d WatchSource:0}: Error finding container 519d041027127a7c610e23fe58ff971bf880eec0b3699a6818ca7a2581c0362d: Status 404 returned error can't find the container with id 519d041027127a7c610e23fe58ff971bf880eec0b3699a6818ca7a2581c0362d Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.794472 4825 generic.go:334] "Generic (PLEG): container finished" podID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerID="cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8" exitCode=0 Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.794599 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" event={"ID":"68dafb60-bc0b-4329-9114-91a5b6d5bfe8","Type":"ContainerDied","Data":"cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8"} Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.794626 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" event={"ID":"68dafb60-bc0b-4329-9114-91a5b6d5bfe8","Type":"ContainerDied","Data":"7db2880860662f1be563985de336c2ff15c7a2d6ba89409d24b34301777afd39"} Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.794646 4825 scope.go:117] "RemoveContainer" containerID="cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.794799 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l6vtv" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.797464 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" event={"ID":"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80","Type":"ContainerStarted","Data":"519d041027127a7c610e23fe58ff971bf880eec0b3699a6818ca7a2581c0362d"} Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.821995 4825 scope.go:117] "RemoveContainer" containerID="6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.978492 4825 scope.go:117] "RemoveContainer" containerID="cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8" Oct 07 19:22:30 crc kubenswrapper[4825]: E1007 19:22:30.979455 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8\": container with ID starting with cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8 not found: ID does not exist" containerID="cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.979486 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8"} err="failed to get container status \"cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8\": rpc error: code = NotFound desc = could not find container \"cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8\": container with ID starting with cf0f8eb92d6c94287a0a3820a9a1ff21119291d7547c2662cf3b54fe0e6d8ff8 not found: ID does not exist" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.979510 4825 scope.go:117] "RemoveContainer" containerID="6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60" Oct 07 19:22:30 crc 
kubenswrapper[4825]: E1007 19:22:30.981551 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60\": container with ID starting with 6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60 not found: ID does not exist" containerID="6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.981605 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60"} err="failed to get container status \"6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60\": rpc error: code = NotFound desc = could not find container \"6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60\": container with ID starting with 6a48430f847ce6c39c66e614a973d54ad1ebe29392a4bc1b8a691c87d0dbfb60 not found: ID does not exist" Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.992846 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l6vtv"] Oct 07 19:22:30 crc kubenswrapper[4825]: I1007 19:22:30.999854 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l6vtv"] Oct 07 19:22:31 crc kubenswrapper[4825]: I1007 19:22:31.811950 4825 generic.go:334] "Generic (PLEG): container finished" podID="4dbb4b22-9fab-40a8-8fee-1d77e4e37c80" containerID="af5f5677460e56473cee527a37622277b34ed6be9eccc9fb46dc151e576a1059" exitCode=0 Oct 07 19:22:31 crc kubenswrapper[4825]: I1007 19:22:31.812331 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" path="/var/lib/kubelet/pods/68dafb60-bc0b-4329-9114-91a5b6d5bfe8/volumes" Oct 07 19:22:31 crc kubenswrapper[4825]: I1007 19:22:31.813086 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" event={"ID":"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80","Type":"ContainerDied","Data":"af5f5677460e56473cee527a37622277b34ed6be9eccc9fb46dc151e576a1059"} Oct 07 19:22:32 crc kubenswrapper[4825]: I1007 19:22:32.832198 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" event={"ID":"4dbb4b22-9fab-40a8-8fee-1d77e4e37c80","Type":"ContainerStarted","Data":"a6807e36eda2de86ad40ec083883cc70fe5959329e6ba1390e79c0fcdd9b0193"} Oct 07 19:22:32 crc kubenswrapper[4825]: I1007 19:22:32.832733 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:32 crc kubenswrapper[4825]: I1007 19:22:32.876982 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" podStartSLOduration=3.8769555110000002 podStartE2EDuration="3.876955511s" podCreationTimestamp="2025-10-07 19:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:22:32.86593645 +0000 UTC m=+1341.687975117" watchObservedRunningTime="2025-10-07 19:22:32.876955511 +0000 UTC m=+1341.698994188" Oct 07 19:22:40 crc kubenswrapper[4825]: I1007 19:22:40.199488 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-gkpsf" Oct 07 19:22:40 crc kubenswrapper[4825]: I1007 19:22:40.298900 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-dt42p"] Oct 07 19:22:40 crc kubenswrapper[4825]: I1007 19:22:40.299211 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" podUID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerName="dnsmasq-dns" containerID="cri-o://08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b" gracePeriod=10 Oct 07 19:22:41 crc 
kubenswrapper[4825]: I1007 19:22:40.780071 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.878411 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-config\") pod \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.878834 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-sb\") pod \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.879325 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj8cd\" (UniqueName: \"kubernetes.io/projected/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-kube-api-access-bj8cd\") pod \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.879395 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-nb\") pod \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.879435 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-svc\") pod \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.879498 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-openstack-edpm-ipam\") pod \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.879519 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-swift-storage-0\") pod \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\" (UID: \"6716deb6-110b-43b0-a4f7-9897d1c9e3bd\") " Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.889811 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-kube-api-access-bj8cd" (OuterVolumeSpecName: "kube-api-access-bj8cd") pod "6716deb6-110b-43b0-a4f7-9897d1c9e3bd" (UID: "6716deb6-110b-43b0-a4f7-9897d1c9e3bd"). InnerVolumeSpecName "kube-api-access-bj8cd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.943285 4825 generic.go:334] "Generic (PLEG): container finished" podID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerID="08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b" exitCode=0 Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.943331 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" event={"ID":"6716deb6-110b-43b0-a4f7-9897d1c9e3bd","Type":"ContainerDied","Data":"08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b"} Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.943364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" event={"ID":"6716deb6-110b-43b0-a4f7-9897d1c9e3bd","Type":"ContainerDied","Data":"b7a8b6d22cd5750225d6d80440099f7c534e05180ac8c39544a540ca2781fc60"} Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.943384 4825 scope.go:117] "RemoveContainer" containerID="08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.943520 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-dt42p" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.956158 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6716deb6-110b-43b0-a4f7-9897d1c9e3bd" (UID: "6716deb6-110b-43b0-a4f7-9897d1c9e3bd"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.966041 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6716deb6-110b-43b0-a4f7-9897d1c9e3bd" (UID: "6716deb6-110b-43b0-a4f7-9897d1c9e3bd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.972807 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-config" (OuterVolumeSpecName: "config") pod "6716deb6-110b-43b0-a4f7-9897d1c9e3bd" (UID: "6716deb6-110b-43b0-a4f7-9897d1c9e3bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.981448 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.981468 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.981479 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-config\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.981487 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj8cd\" (UniqueName: \"kubernetes.io/projected/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-kube-api-access-bj8cd\") on node 
\"crc\" DevicePath \"\"" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.992329 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6716deb6-110b-43b0-a4f7-9897d1c9e3bd" (UID: "6716deb6-110b-43b0-a4f7-9897d1c9e3bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:40.994185 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6716deb6-110b-43b0-a4f7-9897d1c9e3bd" (UID: "6716deb6-110b-43b0-a4f7-9897d1c9e3bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.000435 4825 scope.go:117] "RemoveContainer" containerID="2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.002729 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6716deb6-110b-43b0-a4f7-9897d1c9e3bd" (UID: "6716deb6-110b-43b0-a4f7-9897d1c9e3bd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.017991 4825 scope.go:117] "RemoveContainer" containerID="08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b" Oct 07 19:22:41 crc kubenswrapper[4825]: E1007 19:22:41.018217 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b\": container with ID starting with 08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b not found: ID does not exist" containerID="08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.018252 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b"} err="failed to get container status \"08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b\": rpc error: code = NotFound desc = could not find container \"08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b\": container with ID starting with 08cc1b2129120a368df7e7ff23b28074ef76fcc2fc7b5466f66740384625595b not found: ID does not exist" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.018272 4825 scope.go:117] "RemoveContainer" containerID="2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c" Oct 07 19:22:41 crc kubenswrapper[4825]: E1007 19:22:41.018550 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c\": container with ID starting with 2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c not found: ID does not exist" containerID="2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.018582 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c"} err="failed to get container status \"2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c\": rpc error: code = NotFound desc = could not find container \"2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c\": container with ID starting with 2e790c9fb93d31c2f0b7bc6845c9f58513f0ce49b3fd10059b3f66308a39231c not found: ID does not exist" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.083976 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.083995 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.084004 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6716deb6-110b-43b0-a4f7-9897d1c9e3bd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.282519 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-dt42p"] Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.289246 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-dt42p"] Oct 07 19:22:41 crc kubenswrapper[4825]: I1007 19:22:41.814139 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" path="/var/lib/kubelet/pods/6716deb6-110b-43b0-a4f7-9897d1c9e3bd/volumes" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.078948 4825 generic.go:334] "Generic (PLEG): 
container finished" podID="18c777f8-aad0-482a-b132-ad417d64eb6e" containerID="f8ddf6df8392180177a9608cc374d7b84023de7f3cffa9ad21070d8eea70c927" exitCode=0 Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.079035 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18c777f8-aad0-482a-b132-ad417d64eb6e","Type":"ContainerDied","Data":"f8ddf6df8392180177a9608cc374d7b84023de7f3cffa9ad21070d8eea70c927"} Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.623050 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw"] Oct 07 19:22:53 crc kubenswrapper[4825]: E1007 19:22:53.623710 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerName="dnsmasq-dns" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.623729 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerName="dnsmasq-dns" Oct 07 19:22:53 crc kubenswrapper[4825]: E1007 19:22:53.623738 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerName="init" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.623745 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerName="init" Oct 07 19:22:53 crc kubenswrapper[4825]: E1007 19:22:53.623780 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerName="dnsmasq-dns" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.623788 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerName="dnsmasq-dns" Oct 07 19:22:53 crc kubenswrapper[4825]: E1007 19:22:53.623796 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerName="init" Oct 07 
19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.623823 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerName="init" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.623985 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="68dafb60-bc0b-4329-9114-91a5b6d5bfe8" containerName="dnsmasq-dns" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.624002 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6716deb6-110b-43b0-a4f7-9897d1c9e3bd" containerName="dnsmasq-dns" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.624635 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.626123 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.628173 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.628820 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.630179 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.653114 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw"] Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.757011 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.757111 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.757173 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45wvh\" (UniqueName: \"kubernetes.io/projected/00148c9a-f926-4ff0-a78a-239fae3968d5-kube-api-access-45wvh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.757199 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.858746 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 
19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.858824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.858871 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45wvh\" (UniqueName: \"kubernetes.io/projected/00148c9a-f926-4ff0-a78a-239fae3968d5-kube-api-access-45wvh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.858891 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.863878 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.864904 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.866021 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.881774 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45wvh\" (UniqueName: \"kubernetes.io/projected/00148c9a-f926-4ff0-a78a-239fae3968d5-kube-api-access-45wvh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:53 crc kubenswrapper[4825]: I1007 19:22:53.945857 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:22:54 crc kubenswrapper[4825]: I1007 19:22:54.102345 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18c777f8-aad0-482a-b132-ad417d64eb6e","Type":"ContainerStarted","Data":"dc3964b4403dce1bfb2fe68da6af456c8f36b65e0fa136db0cd39720c149ac35"} Oct 07 19:22:54 crc kubenswrapper[4825]: I1007 19:22:54.103918 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 19:22:54 crc kubenswrapper[4825]: I1007 19:22:54.139879 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.139864336 podStartE2EDuration="37.139864336s" podCreationTimestamp="2025-10-07 19:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:22:54.135136204 +0000 UTC m=+1362.957174841" watchObservedRunningTime="2025-10-07 19:22:54.139864336 +0000 UTC m=+1362.961902973" Oct 07 19:22:54 crc kubenswrapper[4825]: I1007 19:22:54.490501 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw"] Oct 07 19:22:54 crc kubenswrapper[4825]: I1007 19:22:54.498248 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:22:55 crc kubenswrapper[4825]: I1007 19:22:55.113301 4825 generic.go:334] "Generic (PLEG): container finished" podID="e773083b-ae36-44eb-bb82-18b12b504439" containerID="9b796568611911d3c7592d7ff963cb1852c68089a96641586d73d116ab1b337b" exitCode=0 Oct 07 19:22:55 crc kubenswrapper[4825]: I1007 19:22:55.113373 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"e773083b-ae36-44eb-bb82-18b12b504439","Type":"ContainerDied","Data":"9b796568611911d3c7592d7ff963cb1852c68089a96641586d73d116ab1b337b"} Oct 07 19:22:55 crc kubenswrapper[4825]: I1007 19:22:55.116071 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" event={"ID":"00148c9a-f926-4ff0-a78a-239fae3968d5","Type":"ContainerStarted","Data":"fd32850185c21d30e2545a17019ed293eb903440f0b4511bc35d89fc8087c720"} Oct 07 19:22:56 crc kubenswrapper[4825]: I1007 19:22:56.139116 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e773083b-ae36-44eb-bb82-18b12b504439","Type":"ContainerStarted","Data":"89e034c635912b236f396c79718f0a60f009dd8deaea85069e689a59fc6c1a67"} Oct 07 19:22:56 crc kubenswrapper[4825]: I1007 19:22:56.139995 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:22:56 crc kubenswrapper[4825]: I1007 19:22:56.161965 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.161951393 podStartE2EDuration="38.161951393s" podCreationTimestamp="2025-10-07 19:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:22:56.161666774 +0000 UTC m=+1364.983705441" watchObservedRunningTime="2025-10-07 19:22:56.161951393 +0000 UTC m=+1364.983990030" Oct 07 19:23:05 crc kubenswrapper[4825]: I1007 19:23:05.709122 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:23:05 crc kubenswrapper[4825]: I1007 19:23:05.709726 4825 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:23:07 crc kubenswrapper[4825]: I1007 19:23:07.256671 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" event={"ID":"00148c9a-f926-4ff0-a78a-239fae3968d5","Type":"ContainerStarted","Data":"6b821f9f5d1fe487db882a3052fa94f827d1a20420a6cd48af6606137802d39f"} Oct 07 19:23:08 crc kubenswrapper[4825]: I1007 19:23:08.027404 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 19:23:08 crc kubenswrapper[4825]: I1007 19:23:08.078623 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" podStartSLOduration=3.176084017 podStartE2EDuration="15.078592422s" podCreationTimestamp="2025-10-07 19:22:53 +0000 UTC" firstStartedPulling="2025-10-07 19:22:54.498037994 +0000 UTC m=+1363.320076631" lastFinishedPulling="2025-10-07 19:23:06.400546369 +0000 UTC m=+1375.222585036" observedRunningTime="2025-10-07 19:23:07.279614582 +0000 UTC m=+1376.101653229" watchObservedRunningTime="2025-10-07 19:23:08.078592422 +0000 UTC m=+1376.900631089" Oct 07 19:23:09 crc kubenswrapper[4825]: I1007 19:23:09.026500 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 19:23:17 crc kubenswrapper[4825]: I1007 19:23:17.370277 4825 generic.go:334] "Generic (PLEG): container finished" podID="00148c9a-f926-4ff0-a78a-239fae3968d5" containerID="6b821f9f5d1fe487db882a3052fa94f827d1a20420a6cd48af6606137802d39f" exitCode=0 Oct 07 19:23:17 crc kubenswrapper[4825]: I1007 19:23:17.370300 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" event={"ID":"00148c9a-f926-4ff0-a78a-239fae3968d5","Type":"ContainerDied","Data":"6b821f9f5d1fe487db882a3052fa94f827d1a20420a6cd48af6606137802d39f"} Oct 07 19:23:18 crc kubenswrapper[4825]: I1007 19:23:18.827990 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:23:18 crc kubenswrapper[4825]: I1007 19:23:18.959762 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-ssh-key\") pod \"00148c9a-f926-4ff0-a78a-239fae3968d5\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " Oct 07 19:23:18 crc kubenswrapper[4825]: I1007 19:23:18.959841 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-repo-setup-combined-ca-bundle\") pod \"00148c9a-f926-4ff0-a78a-239fae3968d5\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " Oct 07 19:23:18 crc kubenswrapper[4825]: I1007 19:23:18.959901 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45wvh\" (UniqueName: \"kubernetes.io/projected/00148c9a-f926-4ff0-a78a-239fae3968d5-kube-api-access-45wvh\") pod \"00148c9a-f926-4ff0-a78a-239fae3968d5\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " Oct 07 19:23:18 crc kubenswrapper[4825]: I1007 19:23:18.959937 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-inventory\") pod \"00148c9a-f926-4ff0-a78a-239fae3968d5\" (UID: \"00148c9a-f926-4ff0-a78a-239fae3968d5\") " Oct 07 19:23:18 crc kubenswrapper[4825]: I1007 19:23:18.966546 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/00148c9a-f926-4ff0-a78a-239fae3968d5-kube-api-access-45wvh" (OuterVolumeSpecName: "kube-api-access-45wvh") pod "00148c9a-f926-4ff0-a78a-239fae3968d5" (UID: "00148c9a-f926-4ff0-a78a-239fae3968d5"). InnerVolumeSpecName "kube-api-access-45wvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:23:18 crc kubenswrapper[4825]: I1007 19:23:18.967590 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "00148c9a-f926-4ff0-a78a-239fae3968d5" (UID: "00148c9a-f926-4ff0-a78a-239fae3968d5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.010729 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "00148c9a-f926-4ff0-a78a-239fae3968d5" (UID: "00148c9a-f926-4ff0-a78a-239fae3968d5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.010966 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-inventory" (OuterVolumeSpecName: "inventory") pod "00148c9a-f926-4ff0-a78a-239fae3968d5" (UID: "00148c9a-f926-4ff0-a78a-239fae3968d5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.063597 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.063648 4825 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.063672 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45wvh\" (UniqueName: \"kubernetes.io/projected/00148c9a-f926-4ff0-a78a-239fae3968d5-kube-api-access-45wvh\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.063693 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00148c9a-f926-4ff0-a78a-239fae3968d5-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.401171 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" event={"ID":"00148c9a-f926-4ff0-a78a-239fae3968d5","Type":"ContainerDied","Data":"fd32850185c21d30e2545a17019ed293eb903440f0b4511bc35d89fc8087c720"} Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.401223 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd32850185c21d30e2545a17019ed293eb903440f0b4511bc35d89fc8087c720" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.401271 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.510170 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd"] Oct 07 19:23:19 crc kubenswrapper[4825]: E1007 19:23:19.510853 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00148c9a-f926-4ff0-a78a-239fae3968d5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.510887 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="00148c9a-f926-4ff0-a78a-239fae3968d5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.511282 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="00148c9a-f926-4ff0-a78a-239fae3968d5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.512306 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.515364 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.517654 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.518018 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.522782 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.523713 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd"] Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.573724 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4j4\" (UniqueName: \"kubernetes.io/projected/7f652c6b-fc94-47dc-90ec-a19d7e49d728-kube-api-access-4x4j4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.573947 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.574107 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.674943 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.675046 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.675132 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4j4\" (UniqueName: \"kubernetes.io/projected/7f652c6b-fc94-47dc-90ec-a19d7e49d728-kube-api-access-4x4j4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.683957 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.687929 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.702456 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4j4\" (UniqueName: \"kubernetes.io/projected/7f652c6b-fc94-47dc-90ec-a19d7e49d728-kube-api-access-4x4j4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q2wcd\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:19 crc kubenswrapper[4825]: I1007 19:23:19.837850 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:20 crc kubenswrapper[4825]: I1007 19:23:20.427784 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd"] Oct 07 19:23:21 crc kubenswrapper[4825]: I1007 19:23:21.427342 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" event={"ID":"7f652c6b-fc94-47dc-90ec-a19d7e49d728","Type":"ContainerStarted","Data":"19d941f664b2f28f53e9d328dd0144388532f984123cd1f25ec9aeb1560d5b21"} Oct 07 19:23:21 crc kubenswrapper[4825]: I1007 19:23:21.427664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" event={"ID":"7f652c6b-fc94-47dc-90ec-a19d7e49d728","Type":"ContainerStarted","Data":"56888d4d497f066cc40115389ae40b701e3047b242f32b3dac3c8e526225d847"} Oct 07 19:23:21 crc kubenswrapper[4825]: I1007 19:23:21.471774 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" podStartSLOduration=2.27548599 podStartE2EDuration="2.471743408s" podCreationTimestamp="2025-10-07 19:23:19 +0000 UTC" firstStartedPulling="2025-10-07 19:23:20.433071709 +0000 UTC m=+1389.255110356" lastFinishedPulling="2025-10-07 19:23:20.629329127 +0000 UTC m=+1389.451367774" observedRunningTime="2025-10-07 19:23:21.452747726 +0000 UTC m=+1390.274786373" watchObservedRunningTime="2025-10-07 19:23:21.471743408 +0000 UTC m=+1390.293782085" Oct 07 19:23:23 crc kubenswrapper[4825]: I1007 19:23:23.452068 4825 generic.go:334] "Generic (PLEG): container finished" podID="7f652c6b-fc94-47dc-90ec-a19d7e49d728" containerID="19d941f664b2f28f53e9d328dd0144388532f984123cd1f25ec9aeb1560d5b21" exitCode=0 Oct 07 19:23:23 crc kubenswrapper[4825]: I1007 19:23:23.452299 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" event={"ID":"7f652c6b-fc94-47dc-90ec-a19d7e49d728","Type":"ContainerDied","Data":"19d941f664b2f28f53e9d328dd0144388532f984123cd1f25ec9aeb1560d5b21"} Oct 07 19:23:24 crc kubenswrapper[4825]: I1007 19:23:24.971950 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.085158 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4j4\" (UniqueName: \"kubernetes.io/projected/7f652c6b-fc94-47dc-90ec-a19d7e49d728-kube-api-access-4x4j4\") pod \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.085288 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-ssh-key\") pod \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.085459 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-inventory\") pod \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\" (UID: \"7f652c6b-fc94-47dc-90ec-a19d7e49d728\") " Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.090625 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f652c6b-fc94-47dc-90ec-a19d7e49d728-kube-api-access-4x4j4" (OuterVolumeSpecName: "kube-api-access-4x4j4") pod "7f652c6b-fc94-47dc-90ec-a19d7e49d728" (UID: "7f652c6b-fc94-47dc-90ec-a19d7e49d728"). InnerVolumeSpecName "kube-api-access-4x4j4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.122885 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7f652c6b-fc94-47dc-90ec-a19d7e49d728" (UID: "7f652c6b-fc94-47dc-90ec-a19d7e49d728"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.130783 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-inventory" (OuterVolumeSpecName: "inventory") pod "7f652c6b-fc94-47dc-90ec-a19d7e49d728" (UID: "7f652c6b-fc94-47dc-90ec-a19d7e49d728"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.187484 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.187520 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f652c6b-fc94-47dc-90ec-a19d7e49d728-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.187530 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4j4\" (UniqueName: \"kubernetes.io/projected/7f652c6b-fc94-47dc-90ec-a19d7e49d728-kube-api-access-4x4j4\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.483876 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" 
event={"ID":"7f652c6b-fc94-47dc-90ec-a19d7e49d728","Type":"ContainerDied","Data":"56888d4d497f066cc40115389ae40b701e3047b242f32b3dac3c8e526225d847"} Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.483956 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56888d4d497f066cc40115389ae40b701e3047b242f32b3dac3c8e526225d847" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.483920 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q2wcd" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.588189 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2"] Oct 07 19:23:25 crc kubenswrapper[4825]: E1007 19:23:25.589287 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f652c6b-fc94-47dc-90ec-a19d7e49d728" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.589317 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f652c6b-fc94-47dc-90ec-a19d7e49d728" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.589631 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f652c6b-fc94-47dc-90ec-a19d7e49d728" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.590675 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.594909 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.595043 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.595191 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.596204 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.618349 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2"] Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.698673 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.698728 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt9j2\" (UniqueName: \"kubernetes.io/projected/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-kube-api-access-vt9j2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.698747 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.698774 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.800528 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.800597 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt9j2\" (UniqueName: \"kubernetes.io/projected/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-kube-api-access-vt9j2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.800639 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.800723 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.807247 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.807776 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.808120 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc 
kubenswrapper[4825]: I1007 19:23:25.821830 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt9j2\" (UniqueName: \"kubernetes.io/projected/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-kube-api-access-vt9j2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:25 crc kubenswrapper[4825]: I1007 19:23:25.933510 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:23:26 crc kubenswrapper[4825]: I1007 19:23:26.490521 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2"] Oct 07 19:23:26 crc kubenswrapper[4825]: W1007 19:23:26.494412 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podede848ae_130b_4c5c_a4fb_873d9ea65cb6.slice/crio-8a2aa17271b3e6f27e7cc554523d84dfd72a0e1889be4d36882ffcd5c1f2c5ff WatchSource:0}: Error finding container 8a2aa17271b3e6f27e7cc554523d84dfd72a0e1889be4d36882ffcd5c1f2c5ff: Status 404 returned error can't find the container with id 8a2aa17271b3e6f27e7cc554523d84dfd72a0e1889be4d36882ffcd5c1f2c5ff Oct 07 19:23:27 crc kubenswrapper[4825]: I1007 19:23:27.504601 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" event={"ID":"ede848ae-130b-4c5c-a4fb-873d9ea65cb6","Type":"ContainerStarted","Data":"1efabbfa0b707b79b34d2b27c971f057d921c1894a166720a2784adf6f2e5446"} Oct 07 19:23:27 crc kubenswrapper[4825]: I1007 19:23:27.504893 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" 
event={"ID":"ede848ae-130b-4c5c-a4fb-873d9ea65cb6","Type":"ContainerStarted","Data":"8a2aa17271b3e6f27e7cc554523d84dfd72a0e1889be4d36882ffcd5c1f2c5ff"} Oct 07 19:23:27 crc kubenswrapper[4825]: I1007 19:23:27.533019 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" podStartSLOduration=2.350017838 podStartE2EDuration="2.532985426s" podCreationTimestamp="2025-10-07 19:23:25 +0000 UTC" firstStartedPulling="2025-10-07 19:23:26.499160174 +0000 UTC m=+1395.321198841" lastFinishedPulling="2025-10-07 19:23:26.682127752 +0000 UTC m=+1395.504166429" observedRunningTime="2025-10-07 19:23:27.517153126 +0000 UTC m=+1396.339191763" watchObservedRunningTime="2025-10-07 19:23:27.532985426 +0000 UTC m=+1396.355024073" Oct 07 19:23:35 crc kubenswrapper[4825]: I1007 19:23:35.709373 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:23:35 crc kubenswrapper[4825]: I1007 19:23:35.710437 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.325619 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xsj9q"] Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.327570 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.354780 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsj9q"] Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.423569 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-utilities\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.423639 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7lff\" (UniqueName: \"kubernetes.io/projected/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-kube-api-access-h7lff\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.423837 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-catalog-content\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.525282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-catalog-content\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.525559 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-utilities\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.525613 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7lff\" (UniqueName: \"kubernetes.io/projected/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-kube-api-access-h7lff\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.525921 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-utilities\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.525924 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-catalog-content\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.544090 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7lff\" (UniqueName: \"kubernetes.io/projected/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-kube-api-access-h7lff\") pod \"community-operators-xsj9q\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:36 crc kubenswrapper[4825]: I1007 19:23:36.650718 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:37 crc kubenswrapper[4825]: I1007 19:23:37.263297 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsj9q"] Oct 07 19:23:37 crc kubenswrapper[4825]: I1007 19:23:37.622789 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerStarted","Data":"3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2"} Oct 07 19:23:37 crc kubenswrapper[4825]: I1007 19:23:37.623048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerStarted","Data":"c70e53d0c9a6985bf215a0da55525ccbb6bb1bdc581638e9c1f129c5c35a9db5"} Oct 07 19:23:38 crc kubenswrapper[4825]: I1007 19:23:38.635965 4825 generic.go:334] "Generic (PLEG): container finished" podID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerID="3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2" exitCode=0 Oct 07 19:23:38 crc kubenswrapper[4825]: I1007 19:23:38.636049 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerDied","Data":"3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2"} Oct 07 19:23:38 crc kubenswrapper[4825]: I1007 19:23:38.636119 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerStarted","Data":"df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7"} Oct 07 19:23:39 crc kubenswrapper[4825]: I1007 19:23:39.653790 4825 generic.go:334] "Generic (PLEG): container finished" podID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" 
containerID="df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7" exitCode=0 Oct 07 19:23:39 crc kubenswrapper[4825]: I1007 19:23:39.653934 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerDied","Data":"df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7"} Oct 07 19:23:40 crc kubenswrapper[4825]: I1007 19:23:40.674863 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerStarted","Data":"6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f"} Oct 07 19:23:40 crc kubenswrapper[4825]: I1007 19:23:40.707967 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xsj9q" podStartSLOduration=2.114389585 podStartE2EDuration="4.707946737s" podCreationTimestamp="2025-10-07 19:23:36 +0000 UTC" firstStartedPulling="2025-10-07 19:23:37.624548152 +0000 UTC m=+1406.446586829" lastFinishedPulling="2025-10-07 19:23:40.218105344 +0000 UTC m=+1409.040143981" observedRunningTime="2025-10-07 19:23:40.701643285 +0000 UTC m=+1409.523681952" watchObservedRunningTime="2025-10-07 19:23:40.707946737 +0000 UTC m=+1409.529985384" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.624061 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h2l4w"] Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.627541 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.642961 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2l4w"] Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.802294 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llmf\" (UniqueName: \"kubernetes.io/projected/88cc094b-caeb-4d99-a729-1de502f39008-kube-api-access-8llmf\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.802360 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-utilities\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.802579 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-catalog-content\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.904888 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llmf\" (UniqueName: \"kubernetes.io/projected/88cc094b-caeb-4d99-a729-1de502f39008-kube-api-access-8llmf\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.904973 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-utilities\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.905092 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-catalog-content\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.905609 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-utilities\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.905911 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-catalog-content\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.931065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llmf\" (UniqueName: \"kubernetes.io/projected/88cc094b-caeb-4d99-a729-1de502f39008-kube-api-access-8llmf\") pod \"certified-operators-h2l4w\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:44 crc kubenswrapper[4825]: I1007 19:23:44.949723 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:45 crc kubenswrapper[4825]: W1007 19:23:45.459141 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88cc094b_caeb_4d99_a729_1de502f39008.slice/crio-91039d3236a2292a3d950ddd016abc021ef22a9197519df4bd9fad41b0e56e30 WatchSource:0}: Error finding container 91039d3236a2292a3d950ddd016abc021ef22a9197519df4bd9fad41b0e56e30: Status 404 returned error can't find the container with id 91039d3236a2292a3d950ddd016abc021ef22a9197519df4bd9fad41b0e56e30 Oct 07 19:23:45 crc kubenswrapper[4825]: I1007 19:23:45.470103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2l4w"] Oct 07 19:23:45 crc kubenswrapper[4825]: I1007 19:23:45.731282 4825 generic.go:334] "Generic (PLEG): container finished" podID="88cc094b-caeb-4d99-a729-1de502f39008" containerID="416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655" exitCode=0 Oct 07 19:23:45 crc kubenswrapper[4825]: I1007 19:23:45.731428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2l4w" event={"ID":"88cc094b-caeb-4d99-a729-1de502f39008","Type":"ContainerDied","Data":"416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655"} Oct 07 19:23:45 crc kubenswrapper[4825]: I1007 19:23:45.731549 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2l4w" event={"ID":"88cc094b-caeb-4d99-a729-1de502f39008","Type":"ContainerStarted","Data":"91039d3236a2292a3d950ddd016abc021ef22a9197519df4bd9fad41b0e56e30"} Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.306907 4825 scope.go:117] "RemoveContainer" containerID="6e9078d9dd0b938b3836bba75f2bd89a5c00e0bcaec1de46c704b6a36503a827" Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.329425 4825 scope.go:117] "RemoveContainer" 
containerID="9bb596719b7860d14cea532bbcf0e4c41a9d1fc0bb2dd5474cf0b63c324a1ce9" Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.419438 4825 scope.go:117] "RemoveContainer" containerID="88747c9b773efbe2d6d1a5ee5b29480b3001c19bb98e79aa140e3b86e63dc2ff" Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.461120 4825 scope.go:117] "RemoveContainer" containerID="3b84d0931612a1e93c4132303b40b31f1cfc646d94147f505854b40b58abb5a7" Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.651128 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.651258 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.722524 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:46 crc kubenswrapper[4825]: I1007 19:23:46.800531 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:47 crc kubenswrapper[4825]: I1007 19:23:47.752640 4825 generic.go:334] "Generic (PLEG): container finished" podID="88cc094b-caeb-4d99-a729-1de502f39008" containerID="4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345" exitCode=0 Oct 07 19:23:47 crc kubenswrapper[4825]: I1007 19:23:47.752708 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2l4w" event={"ID":"88cc094b-caeb-4d99-a729-1de502f39008","Type":"ContainerDied","Data":"4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345"} Oct 07 19:23:48 crc kubenswrapper[4825]: I1007 19:23:48.193578 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsj9q"] Oct 07 19:23:48 crc kubenswrapper[4825]: I1007 
19:23:48.773247 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2l4w" event={"ID":"88cc094b-caeb-4d99-a729-1de502f39008","Type":"ContainerStarted","Data":"f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908"} Oct 07 19:23:48 crc kubenswrapper[4825]: I1007 19:23:48.811875 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h2l4w" podStartSLOduration=2.327932348 podStartE2EDuration="4.811854626s" podCreationTimestamp="2025-10-07 19:23:44 +0000 UTC" firstStartedPulling="2025-10-07 19:23:45.733278245 +0000 UTC m=+1414.555316872" lastFinishedPulling="2025-10-07 19:23:48.217200493 +0000 UTC m=+1417.039239150" observedRunningTime="2025-10-07 19:23:48.80950803 +0000 UTC m=+1417.631546727" watchObservedRunningTime="2025-10-07 19:23:48.811854626 +0000 UTC m=+1417.633893283" Oct 07 19:23:49 crc kubenswrapper[4825]: I1007 19:23:49.781792 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xsj9q" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="registry-server" containerID="cri-o://6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f" gracePeriod=2 Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.251216 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.317328 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-catalog-content\") pod \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.317438 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7lff\" (UniqueName: \"kubernetes.io/projected/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-kube-api-access-h7lff\") pod \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.317495 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-utilities\") pod \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\" (UID: \"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0\") " Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.318151 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-utilities" (OuterVolumeSpecName: "utilities") pod "0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" (UID: "0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.324194 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-kube-api-access-h7lff" (OuterVolumeSpecName: "kube-api-access-h7lff") pod "0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" (UID: "0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0"). InnerVolumeSpecName "kube-api-access-h7lff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.360811 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" (UID: "0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.419697 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.419724 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7lff\" (UniqueName: \"kubernetes.io/projected/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-kube-api-access-h7lff\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.419734 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.799463 4825 generic.go:334] "Generic (PLEG): container finished" podID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerID="6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f" exitCode=0 Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.799539 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerDied","Data":"6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f"} Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.799581 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-xsj9q" event={"ID":"0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0","Type":"ContainerDied","Data":"c70e53d0c9a6985bf215a0da55525ccbb6bb1bdc581638e9c1f129c5c35a9db5"} Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.799612 4825 scope.go:117] "RemoveContainer" containerID="6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.799874 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsj9q" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.832863 4825 scope.go:117] "RemoveContainer" containerID="df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.866454 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsj9q"] Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.873212 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xsj9q"] Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.876858 4825 scope.go:117] "RemoveContainer" containerID="3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.897841 4825 scope.go:117] "RemoveContainer" containerID="6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f" Oct 07 19:23:50 crc kubenswrapper[4825]: E1007 19:23:50.898615 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f\": container with ID starting with 6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f not found: ID does not exist" containerID="6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 
19:23:50.898703 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f"} err="failed to get container status \"6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f\": rpc error: code = NotFound desc = could not find container \"6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f\": container with ID starting with 6c64af9ceba1399f88e07cc43973a8bfbf26e4c957a4169700342f5f512a976f not found: ID does not exist" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.898744 4825 scope.go:117] "RemoveContainer" containerID="df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7" Oct 07 19:23:50 crc kubenswrapper[4825]: E1007 19:23:50.899132 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7\": container with ID starting with df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7 not found: ID does not exist" containerID="df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.899198 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7"} err="failed to get container status \"df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7\": rpc error: code = NotFound desc = could not find container \"df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7\": container with ID starting with df8a1ae1a0735f30661f25a68bf1e9042dde6795a9eaeecbd590c1447b0c09e7 not found: ID does not exist" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.899224 4825 scope.go:117] "RemoveContainer" containerID="3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2" Oct 07 19:23:50 crc 
kubenswrapper[4825]: E1007 19:23:50.899751 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2\": container with ID starting with 3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2 not found: ID does not exist" containerID="3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2" Oct 07 19:23:50 crc kubenswrapper[4825]: I1007 19:23:50.899791 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2"} err="failed to get container status \"3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2\": rpc error: code = NotFound desc = could not find container \"3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2\": container with ID starting with 3b18a778de6850f71a8e61e08a050b9bb3e16fc846e9699e17a48a4ef4e95ff2 not found: ID does not exist" Oct 07 19:23:51 crc kubenswrapper[4825]: I1007 19:23:51.817941 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" path="/var/lib/kubelet/pods/0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0/volumes" Oct 07 19:23:54 crc kubenswrapper[4825]: I1007 19:23:54.950491 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:54 crc kubenswrapper[4825]: I1007 19:23:54.950809 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:55 crc kubenswrapper[4825]: I1007 19:23:55.017631 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:55 crc kubenswrapper[4825]: I1007 19:23:55.957408 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:56 crc kubenswrapper[4825]: I1007 19:23:56.022778 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2l4w"] Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.814917 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5g6rm"] Oct 07 19:23:57 crc kubenswrapper[4825]: E1007 19:23:57.817571 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="extract-content" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.817709 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="extract-content" Oct 07 19:23:57 crc kubenswrapper[4825]: E1007 19:23:57.817815 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="registry-server" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.817875 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="registry-server" Oct 07 19:23:57 crc kubenswrapper[4825]: E1007 19:23:57.817949 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="extract-utilities" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.818000 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="extract-utilities" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.827000 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b86c5ab-2af5-4bb1-b5f1-4ba9d53e90a0" containerName="registry-server" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.832746 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5g6rm"] Oct 07 19:23:57 crc 
kubenswrapper[4825]: I1007 19:23:57.832888 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.903096 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h2l4w" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="registry-server" containerID="cri-o://f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908" gracePeriod=2 Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.912703 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tnzt\" (UniqueName: \"kubernetes.io/projected/a01ea531-6435-43c5-ac97-a2f19f511b39-kube-api-access-5tnzt\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.912771 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-utilities\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:57 crc kubenswrapper[4825]: I1007 19:23:57.913294 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-catalog-content\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.015924 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-catalog-content\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.016362 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tnzt\" (UniqueName: \"kubernetes.io/projected/a01ea531-6435-43c5-ac97-a2f19f511b39-kube-api-access-5tnzt\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.016459 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-catalog-content\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.016576 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-utilities\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.017210 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-utilities\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.037169 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tnzt\" (UniqueName: 
\"kubernetes.io/projected/a01ea531-6435-43c5-ac97-a2f19f511b39-kube-api-access-5tnzt\") pod \"redhat-marketplace-5g6rm\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.159242 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.633405 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5g6rm"] Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.805460 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.912120 4825 generic.go:334] "Generic (PLEG): container finished" podID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerID="a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb" exitCode=0 Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.912237 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5g6rm" event={"ID":"a01ea531-6435-43c5-ac97-a2f19f511b39","Type":"ContainerDied","Data":"a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb"} Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.912283 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5g6rm" event={"ID":"a01ea531-6435-43c5-ac97-a2f19f511b39","Type":"ContainerStarted","Data":"135a50be462e1790c2739514a8f87512dba2e232b8a8a6eca307f063b322f6c2"} Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.918416 4825 generic.go:334] "Generic (PLEG): container finished" podID="88cc094b-caeb-4d99-a729-1de502f39008" containerID="f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908" exitCode=0 Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 
19:23:58.918468 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2l4w" event={"ID":"88cc094b-caeb-4d99-a729-1de502f39008","Type":"ContainerDied","Data":"f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908"} Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.918505 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2l4w" event={"ID":"88cc094b-caeb-4d99-a729-1de502f39008","Type":"ContainerDied","Data":"91039d3236a2292a3d950ddd016abc021ef22a9197519df4bd9fad41b0e56e30"} Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.918532 4825 scope.go:117] "RemoveContainer" containerID="f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.918778 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2l4w" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.935925 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-catalog-content\") pod \"88cc094b-caeb-4d99-a729-1de502f39008\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.937627 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-utilities\") pod \"88cc094b-caeb-4d99-a729-1de502f39008\" (UID: \"88cc094b-caeb-4d99-a729-1de502f39008\") " Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.937671 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8llmf\" (UniqueName: \"kubernetes.io/projected/88cc094b-caeb-4d99-a729-1de502f39008-kube-api-access-8llmf\") pod \"88cc094b-caeb-4d99-a729-1de502f39008\" (UID: 
\"88cc094b-caeb-4d99-a729-1de502f39008\") " Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.939453 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-utilities" (OuterVolumeSpecName: "utilities") pod "88cc094b-caeb-4d99-a729-1de502f39008" (UID: "88cc094b-caeb-4d99-a729-1de502f39008"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.941621 4825 scope.go:117] "RemoveContainer" containerID="4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.945021 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88cc094b-caeb-4d99-a729-1de502f39008-kube-api-access-8llmf" (OuterVolumeSpecName: "kube-api-access-8llmf") pod "88cc094b-caeb-4d99-a729-1de502f39008" (UID: "88cc094b-caeb-4d99-a729-1de502f39008"). InnerVolumeSpecName "kube-api-access-8llmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.960179 4825 scope.go:117] "RemoveContainer" containerID="416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.981990 4825 scope.go:117] "RemoveContainer" containerID="f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908" Oct 07 19:23:58 crc kubenswrapper[4825]: E1007 19:23:58.982501 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908\": container with ID starting with f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908 not found: ID does not exist" containerID="f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.982536 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908"} err="failed to get container status \"f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908\": rpc error: code = NotFound desc = could not find container \"f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908\": container with ID starting with f7bbb91024c501d958c73d081d2ced563a9cbc5a65d65e16cf50a887bef85908 not found: ID does not exist" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.982558 4825 scope.go:117] "RemoveContainer" containerID="4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345" Oct 07 19:23:58 crc kubenswrapper[4825]: E1007 19:23:58.982904 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345\": container with ID starting with 
4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345 not found: ID does not exist" containerID="4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.982932 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345"} err="failed to get container status \"4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345\": rpc error: code = NotFound desc = could not find container \"4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345\": container with ID starting with 4df04d5ad47efe25ca1c21adf98be4550ec535d8f57261de7ac3d1e3cfc27345 not found: ID does not exist" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.982950 4825 scope.go:117] "RemoveContainer" containerID="416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655" Oct 07 19:23:58 crc kubenswrapper[4825]: E1007 19:23:58.983384 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655\": container with ID starting with 416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655 not found: ID does not exist" containerID="416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.983434 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655"} err="failed to get container status \"416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655\": rpc error: code = NotFound desc = could not find container \"416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655\": container with ID starting with 416dbaad9ae190db6acf35be5b44587e08f821cd9e20b31a68cc6565fae49655 not found: ID does not 
exist" Oct 07 19:23:58 crc kubenswrapper[4825]: I1007 19:23:58.985195 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88cc094b-caeb-4d99-a729-1de502f39008" (UID: "88cc094b-caeb-4d99-a729-1de502f39008"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:23:59 crc kubenswrapper[4825]: I1007 19:23:59.040165 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:59 crc kubenswrapper[4825]: I1007 19:23:59.040192 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8llmf\" (UniqueName: \"kubernetes.io/projected/88cc094b-caeb-4d99-a729-1de502f39008-kube-api-access-8llmf\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:59 crc kubenswrapper[4825]: I1007 19:23:59.040203 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc094b-caeb-4d99-a729-1de502f39008-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:23:59 crc kubenswrapper[4825]: I1007 19:23:59.256270 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2l4w"] Oct 07 19:23:59 crc kubenswrapper[4825]: I1007 19:23:59.271374 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h2l4w"] Oct 07 19:23:59 crc kubenswrapper[4825]: I1007 19:23:59.812693 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88cc094b-caeb-4d99-a729-1de502f39008" path="/var/lib/kubelet/pods/88cc094b-caeb-4d99-a729-1de502f39008/volumes" Oct 07 19:24:00 crc kubenswrapper[4825]: I1007 19:24:00.954099 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerID="bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122" exitCode=0 Oct 07 19:24:00 crc kubenswrapper[4825]: I1007 19:24:00.954167 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5g6rm" event={"ID":"a01ea531-6435-43c5-ac97-a2f19f511b39","Type":"ContainerDied","Data":"bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122"} Oct 07 19:24:01 crc kubenswrapper[4825]: I1007 19:24:01.976695 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5g6rm" event={"ID":"a01ea531-6435-43c5-ac97-a2f19f511b39","Type":"ContainerStarted","Data":"77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7"} Oct 07 19:24:02 crc kubenswrapper[4825]: I1007 19:24:02.013731 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5g6rm" podStartSLOduration=2.393231014 podStartE2EDuration="5.013705894s" podCreationTimestamp="2025-10-07 19:23:57 +0000 UTC" firstStartedPulling="2025-10-07 19:23:58.914000463 +0000 UTC m=+1427.736039100" lastFinishedPulling="2025-10-07 19:24:01.534475333 +0000 UTC m=+1430.356513980" observedRunningTime="2025-10-07 19:24:02.003141134 +0000 UTC m=+1430.825179821" watchObservedRunningTime="2025-10-07 19:24:02.013705894 +0000 UTC m=+1430.835744571" Oct 07 19:24:05 crc kubenswrapper[4825]: I1007 19:24:05.709023 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:24:05 crc kubenswrapper[4825]: I1007 19:24:05.709533 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:24:05 crc kubenswrapper[4825]: I1007 19:24:05.709590 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:24:05 crc kubenswrapper[4825]: I1007 19:24:05.710345 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb100741ed4991dde5bb64dc9ab561e9fa009739dcc0f0f2c8261720803021e4"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:24:05 crc kubenswrapper[4825]: I1007 19:24:05.710399 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://bb100741ed4991dde5bb64dc9ab561e9fa009739dcc0f0f2c8261720803021e4" gracePeriod=600 Oct 07 19:24:06 crc kubenswrapper[4825]: I1007 19:24:06.019625 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="bb100741ed4991dde5bb64dc9ab561e9fa009739dcc0f0f2c8261720803021e4" exitCode=0 Oct 07 19:24:06 crc kubenswrapper[4825]: I1007 19:24:06.019685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"bb100741ed4991dde5bb64dc9ab561e9fa009739dcc0f0f2c8261720803021e4"} Oct 07 19:24:06 crc kubenswrapper[4825]: I1007 19:24:06.020021 4825 scope.go:117] "RemoveContainer" containerID="7a129d547f9c2f005540980fa89f701d13b633e45c1d0e5a234b2420081b437f" Oct 07 19:24:07 crc 
kubenswrapper[4825]: I1007 19:24:07.036993 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677"} Oct 07 19:24:08 crc kubenswrapper[4825]: I1007 19:24:08.159414 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:24:08 crc kubenswrapper[4825]: I1007 19:24:08.159760 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:24:08 crc kubenswrapper[4825]: I1007 19:24:08.224792 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:24:09 crc kubenswrapper[4825]: I1007 19:24:09.153073 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:24:09 crc kubenswrapper[4825]: I1007 19:24:09.216660 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5g6rm"] Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.085104 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5g6rm" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="registry-server" containerID="cri-o://77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7" gracePeriod=2 Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.584661 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.709920 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-utilities\") pod \"a01ea531-6435-43c5-ac97-a2f19f511b39\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.710001 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-catalog-content\") pod \"a01ea531-6435-43c5-ac97-a2f19f511b39\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.710088 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tnzt\" (UniqueName: \"kubernetes.io/projected/a01ea531-6435-43c5-ac97-a2f19f511b39-kube-api-access-5tnzt\") pod \"a01ea531-6435-43c5-ac97-a2f19f511b39\" (UID: \"a01ea531-6435-43c5-ac97-a2f19f511b39\") " Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.711680 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-utilities" (OuterVolumeSpecName: "utilities") pod "a01ea531-6435-43c5-ac97-a2f19f511b39" (UID: "a01ea531-6435-43c5-ac97-a2f19f511b39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.720620 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01ea531-6435-43c5-ac97-a2f19f511b39-kube-api-access-5tnzt" (OuterVolumeSpecName: "kube-api-access-5tnzt") pod "a01ea531-6435-43c5-ac97-a2f19f511b39" (UID: "a01ea531-6435-43c5-ac97-a2f19f511b39"). InnerVolumeSpecName "kube-api-access-5tnzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.722746 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a01ea531-6435-43c5-ac97-a2f19f511b39" (UID: "a01ea531-6435-43c5-ac97-a2f19f511b39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.812782 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tnzt\" (UniqueName: \"kubernetes.io/projected/a01ea531-6435-43c5-ac97-a2f19f511b39-kube-api-access-5tnzt\") on node \"crc\" DevicePath \"\"" Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.813083 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:24:11 crc kubenswrapper[4825]: I1007 19:24:11.813096 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01ea531-6435-43c5-ac97-a2f19f511b39-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.098757 4825 generic.go:334] "Generic (PLEG): container finished" podID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerID="77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7" exitCode=0 Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.098825 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5g6rm" event={"ID":"a01ea531-6435-43c5-ac97-a2f19f511b39","Type":"ContainerDied","Data":"77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7"} Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.098834 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5g6rm" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.098874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5g6rm" event={"ID":"a01ea531-6435-43c5-ac97-a2f19f511b39","Type":"ContainerDied","Data":"135a50be462e1790c2739514a8f87512dba2e232b8a8a6eca307f063b322f6c2"} Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.098904 4825 scope.go:117] "RemoveContainer" containerID="77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.136199 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5g6rm"] Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.143764 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5g6rm"] Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.145921 4825 scope.go:117] "RemoveContainer" containerID="bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.178757 4825 scope.go:117] "RemoveContainer" containerID="a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.253548 4825 scope.go:117] "RemoveContainer" containerID="77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7" Oct 07 19:24:12 crc kubenswrapper[4825]: E1007 19:24:12.255945 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7\": container with ID starting with 77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7 not found: ID does not exist" containerID="77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.256009 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7"} err="failed to get container status \"77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7\": rpc error: code = NotFound desc = could not find container \"77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7\": container with ID starting with 77be598d2437406f09423d49abec805ffaefd0deb69a4519d7ab24122017dcc7 not found: ID does not exist" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.256044 4825 scope.go:117] "RemoveContainer" containerID="bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122" Oct 07 19:24:12 crc kubenswrapper[4825]: E1007 19:24:12.256762 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122\": container with ID starting with bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122 not found: ID does not exist" containerID="bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.256825 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122"} err="failed to get container status \"bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122\": rpc error: code = NotFound desc = could not find container \"bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122\": container with ID starting with bed3b9c50107115a9f34a51a7526b211158313a92ed71bffe55627dbe8e58122 not found: ID does not exist" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.256880 4825 scope.go:117] "RemoveContainer" containerID="a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb" Oct 07 19:24:12 crc kubenswrapper[4825]: E1007 
19:24:12.257427 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb\": container with ID starting with a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb not found: ID does not exist" containerID="a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb" Oct 07 19:24:12 crc kubenswrapper[4825]: I1007 19:24:12.257456 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb"} err="failed to get container status \"a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb\": rpc error: code = NotFound desc = could not find container \"a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb\": container with ID starting with a53e5eb11ae3e2360da458e4ef76a6a3cf884532457db0efab5cfd054762eceb not found: ID does not exist" Oct 07 19:24:13 crc kubenswrapper[4825]: I1007 19:24:13.814557 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" path="/var/lib/kubelet/pods/a01ea531-6435-43c5-ac97-a2f19f511b39/volumes" Oct 07 19:24:46 crc kubenswrapper[4825]: I1007 19:24:46.562923 4825 scope.go:117] "RemoveContainer" containerID="40621b2130780439219c836e4dd5f0a90517746ff37a56bcf45a53e4dab5c21f" Oct 07 19:24:46 crc kubenswrapper[4825]: I1007 19:24:46.627364 4825 scope.go:117] "RemoveContainer" containerID="e99899481fd0f3aabd7404eeb522dae7f72fe8e8eec6445f5a136a9621e0ccca" Oct 07 19:24:46 crc kubenswrapper[4825]: I1007 19:24:46.687756 4825 scope.go:117] "RemoveContainer" containerID="397f277ef97bc2edcb05128de9b277667c3730f74dce3268c7e03df1a242382a" Oct 07 19:25:46 crc kubenswrapper[4825]: I1007 19:25:46.872384 4825 scope.go:117] "RemoveContainer" containerID="6cc2d532cfcc18c8cb9531818e4d3f565130eb88bb11eebcd3d82a23f350d9bf" Oct 07 19:25:46 crc 
kubenswrapper[4825]: I1007 19:25:46.909692 4825 scope.go:117] "RemoveContainer" containerID="348a031751fa888be540da3478681d04b11c301a45d453dc05e08fa79bd63773" Oct 07 19:26:35 crc kubenswrapper[4825]: I1007 19:26:35.708798 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:26:35 crc kubenswrapper[4825]: I1007 19:26:35.710499 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:26:38 crc kubenswrapper[4825]: I1007 19:26:38.902975 4825 generic.go:334] "Generic (PLEG): container finished" podID="ede848ae-130b-4c5c-a4fb-873d9ea65cb6" containerID="1efabbfa0b707b79b34d2b27c971f057d921c1894a166720a2784adf6f2e5446" exitCode=0 Oct 07 19:26:38 crc kubenswrapper[4825]: I1007 19:26:38.903107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" event={"ID":"ede848ae-130b-4c5c-a4fb-873d9ea65cb6","Type":"ContainerDied","Data":"1efabbfa0b707b79b34d2b27c971f057d921c1894a166720a2784adf6f2e5446"} Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.336138 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.390266 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-ssh-key\") pod \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.390407 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt9j2\" (UniqueName: \"kubernetes.io/projected/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-kube-api-access-vt9j2\") pod \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.390469 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-bootstrap-combined-ca-bundle\") pod \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.390514 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-inventory\") pod \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\" (UID: \"ede848ae-130b-4c5c-a4fb-873d9ea65cb6\") " Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.396287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ede848ae-130b-4c5c-a4fb-873d9ea65cb6" (UID: "ede848ae-130b-4c5c-a4fb-873d9ea65cb6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.400534 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-kube-api-access-vt9j2" (OuterVolumeSpecName: "kube-api-access-vt9j2") pod "ede848ae-130b-4c5c-a4fb-873d9ea65cb6" (UID: "ede848ae-130b-4c5c-a4fb-873d9ea65cb6"). InnerVolumeSpecName "kube-api-access-vt9j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.418762 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ede848ae-130b-4c5c-a4fb-873d9ea65cb6" (UID: "ede848ae-130b-4c5c-a4fb-873d9ea65cb6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.424403 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-inventory" (OuterVolumeSpecName: "inventory") pod "ede848ae-130b-4c5c-a4fb-873d9ea65cb6" (UID: "ede848ae-130b-4c5c-a4fb-873d9ea65cb6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.494611 4825 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.494670 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.494695 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.494719 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt9j2\" (UniqueName: \"kubernetes.io/projected/ede848ae-130b-4c5c-a4fb-873d9ea65cb6-kube-api-access-vt9j2\") on node \"crc\" DevicePath \"\"" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.927947 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" event={"ID":"ede848ae-130b-4c5c-a4fb-873d9ea65cb6","Type":"ContainerDied","Data":"8a2aa17271b3e6f27e7cc554523d84dfd72a0e1889be4d36882ffcd5c1f2c5ff"} Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.928015 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2" Oct 07 19:26:40 crc kubenswrapper[4825]: I1007 19:26:40.928072 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a2aa17271b3e6f27e7cc554523d84dfd72a0e1889be4d36882ffcd5c1f2c5ff" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.035842 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx"] Oct 07 19:26:41 crc kubenswrapper[4825]: E1007 19:26:41.036671 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede848ae-130b-4c5c-a4fb-873d9ea65cb6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.036789 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede848ae-130b-4c5c-a4fb-873d9ea65cb6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 19:26:41 crc kubenswrapper[4825]: E1007 19:26:41.036874 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="extract-content" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.036950 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="extract-content" Oct 07 19:26:41 crc kubenswrapper[4825]: E1007 19:26:41.037033 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="extract-content" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.037181 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="extract-content" Oct 07 19:26:41 crc kubenswrapper[4825]: E1007 19:26:41.037390 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="extract-utilities" Oct 07 19:26:41 crc kubenswrapper[4825]: 
I1007 19:26:41.037555 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="extract-utilities" Oct 07 19:26:41 crc kubenswrapper[4825]: E1007 19:26:41.037748 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="registry-server" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.037952 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="registry-server" Oct 07 19:26:41 crc kubenswrapper[4825]: E1007 19:26:41.038196 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="extract-utilities" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.038655 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="extract-utilities" Oct 07 19:26:41 crc kubenswrapper[4825]: E1007 19:26:41.038934 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="registry-server" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.039099 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="registry-server" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.039550 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01ea531-6435-43c5-ac97-a2f19f511b39" containerName="registry-server" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.039695 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede848ae-130b-4c5c-a4fb-873d9ea65cb6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.039811 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="88cc094b-caeb-4d99-a729-1de502f39008" containerName="registry-server" Oct 07 
19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.040903 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.043745 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.044040 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.045160 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx"] Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.046784 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.047477 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.137203 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.137282 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgj6n\" (UniqueName: \"kubernetes.io/projected/35f68c5c-870d-448d-a680-decef3790f6b-kube-api-access-vgj6n\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.137319 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.239156 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.239455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgj6n\" (UniqueName: \"kubernetes.io/projected/35f68c5c-870d-448d-a680-decef3790f6b-kube-api-access-vgj6n\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.239489 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.247608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.253838 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.265172 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgj6n\" (UniqueName: \"kubernetes.io/projected/35f68c5c-870d-448d-a680-decef3790f6b-kube-api-access-vgj6n\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ngshx\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:41 crc kubenswrapper[4825]: I1007 19:26:41.377647 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:26:42 crc kubenswrapper[4825]: I1007 19:26:42.003182 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx"] Oct 07 19:26:42 crc kubenswrapper[4825]: I1007 19:26:42.953330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" event={"ID":"35f68c5c-870d-448d-a680-decef3790f6b","Type":"ContainerStarted","Data":"ffe8ec11adf3621b46d063b78ef5f56ef20065a586920468e5adabb46f874971"} Oct 07 19:26:42 crc kubenswrapper[4825]: I1007 19:26:42.955304 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" event={"ID":"35f68c5c-870d-448d-a680-decef3790f6b","Type":"ContainerStarted","Data":"76980a64c75f581fde175972d803cd70fa1dadfd4bcb405484ea47c27072db99"} Oct 07 19:26:42 crc kubenswrapper[4825]: I1007 19:26:42.989771 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" podStartSLOduration=1.783990723 podStartE2EDuration="1.989744031s" podCreationTimestamp="2025-10-07 19:26:41 +0000 UTC" firstStartedPulling="2025-10-07 19:26:42.023678504 +0000 UTC m=+1590.845717161" lastFinishedPulling="2025-10-07 19:26:42.229431832 +0000 UTC m=+1591.051470469" observedRunningTime="2025-10-07 19:26:42.981541269 +0000 UTC m=+1591.803579906" watchObservedRunningTime="2025-10-07 19:26:42.989744031 +0000 UTC m=+1591.811782698" Oct 07 19:26:46 crc kubenswrapper[4825]: I1007 19:26:46.984784 4825 scope.go:117] "RemoveContainer" containerID="42711cfc8a81e917fed2e13fd9024ac1f313c82d8d697309eab32f85869245da" Oct 07 19:26:47 crc kubenswrapper[4825]: I1007 19:26:47.012350 4825 scope.go:117] "RemoveContainer" containerID="966133ceb48dc55ea1124a42f971ca7bf564ecadfed153496ac2559fed45aee7" Oct 07 
19:26:47 crc kubenswrapper[4825]: I1007 19:26:47.043338 4825 scope.go:117] "RemoveContainer" containerID="322f2c205fb06cfdc446834b51771e7e1be2cf388fa434fce126d96b3348cd7e" Oct 07 19:27:05 crc kubenswrapper[4825]: I1007 19:27:05.708952 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:27:05 crc kubenswrapper[4825]: I1007 19:27:05.709845 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:27:34 crc kubenswrapper[4825]: I1007 19:27:34.054392 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dh8kx"] Oct 07 19:27:34 crc kubenswrapper[4825]: I1007 19:27:34.064042 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dbtvh"] Oct 07 19:27:34 crc kubenswrapper[4825]: I1007 19:27:34.074927 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dh8kx"] Oct 07 19:27:34 crc kubenswrapper[4825]: I1007 19:27:34.082776 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dbtvh"] Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.035302 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ljvxn"] Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.043436 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ljvxn"] Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.708643 4825 patch_prober.go:28] interesting 
pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.708725 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.708797 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.710368 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.710442 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" gracePeriod=600 Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.810793 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="005059cf-fde6-46af-9e47-def8362671af" path="/var/lib/kubelet/pods/005059cf-fde6-46af-9e47-def8362671af/volumes" Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.811360 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69393f2-e05f-41e8-89ca-a8aa9717edf1" path="/var/lib/kubelet/pods/d69393f2-e05f-41e8-89ca-a8aa9717edf1/volumes" Oct 07 19:27:35 crc kubenswrapper[4825]: I1007 19:27:35.811820 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e295199f-c701-41c4-a4a2-4cd8a1897681" path="/var/lib/kubelet/pods/e295199f-c701-41c4-a4a2-4cd8a1897681/volumes" Oct 07 19:27:35 crc kubenswrapper[4825]: E1007 19:27:35.833047 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:27:36 crc kubenswrapper[4825]: I1007 19:27:36.585165 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" exitCode=0 Oct 07 19:27:36 crc kubenswrapper[4825]: I1007 19:27:36.585262 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677"} Oct 07 19:27:36 crc kubenswrapper[4825]: I1007 19:27:36.585576 4825 scope.go:117] "RemoveContainer" containerID="bb100741ed4991dde5bb64dc9ab561e9fa009739dcc0f0f2c8261720803021e4" Oct 07 19:27:36 crc kubenswrapper[4825]: I1007 19:27:36.586341 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:27:36 crc kubenswrapper[4825]: E1007 19:27:36.586626 4825 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:27:44 crc kubenswrapper[4825]: I1007 19:27:44.037065 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c5cc-account-create-2mfjn"] Oct 07 19:27:44 crc kubenswrapper[4825]: I1007 19:27:44.047382 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c2d6-account-create-2m5ms"] Oct 07 19:27:44 crc kubenswrapper[4825]: I1007 19:27:44.055600 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c5cc-account-create-2mfjn"] Oct 07 19:27:44 crc kubenswrapper[4825]: I1007 19:27:44.062122 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c2d6-account-create-2m5ms"] Oct 07 19:27:45 crc kubenswrapper[4825]: I1007 19:27:45.045524 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0b20-account-create-rjd5j"] Oct 07 19:27:45 crc kubenswrapper[4825]: I1007 19:27:45.056696 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0b20-account-create-rjd5j"] Oct 07 19:27:45 crc kubenswrapper[4825]: I1007 19:27:45.809157 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1172748f-cd1c-42f8-86d2-1da4cd47b03a" path="/var/lib/kubelet/pods/1172748f-cd1c-42f8-86d2-1da4cd47b03a/volumes" Oct 07 19:27:45 crc kubenswrapper[4825]: I1007 19:27:45.810203 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5fcb37-d0be-4175-a3e0-b5589250ccd5" path="/var/lib/kubelet/pods/ac5fcb37-d0be-4175-a3e0-b5589250ccd5/volumes" Oct 07 19:27:45 crc kubenswrapper[4825]: I1007 19:27:45.811177 4825 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="db3cf1a1-3d5d-4acf-b424-c215ca427d3e" path="/var/lib/kubelet/pods/db3cf1a1-3d5d-4acf-b424-c215ca427d3e/volumes" Oct 07 19:27:47 crc kubenswrapper[4825]: I1007 19:27:47.131849 4825 scope.go:117] "RemoveContainer" containerID="adc9cabd9033f2005d4e39c9bb95c2237951be0da8326cbdc65d2292147de465" Oct 07 19:27:47 crc kubenswrapper[4825]: I1007 19:27:47.159858 4825 scope.go:117] "RemoveContainer" containerID="777b6dcead485ffa0a13b0b10bf4cbac471ea96b1183bd40f9df13682a518ec2" Oct 07 19:27:47 crc kubenswrapper[4825]: I1007 19:27:47.204722 4825 scope.go:117] "RemoveContainer" containerID="ea01e24da72514c65848fb39b7e2e7ad1508bdb376f291a214d63d4264607f7f" Oct 07 19:27:47 crc kubenswrapper[4825]: I1007 19:27:47.240221 4825 scope.go:117] "RemoveContainer" containerID="cc8b159a89d0eff39eceec9651706693ffcca0a14ba51fbc537e8e502bc47306" Oct 07 19:27:47 crc kubenswrapper[4825]: I1007 19:27:47.297400 4825 scope.go:117] "RemoveContainer" containerID="336720f4ec97f4c6f4a84147a1bb3dd9b49938348276b7bfd58a4cb20b68e31e" Oct 07 19:27:47 crc kubenswrapper[4825]: I1007 19:27:47.336693 4825 scope.go:117] "RemoveContainer" containerID="fcf57d3eadb0cabfe0e63d6fc61b5ed24f23b9319bb52363dbf9f08302c69f5f" Oct 07 19:27:47 crc kubenswrapper[4825]: I1007 19:27:47.796418 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:27:47 crc kubenswrapper[4825]: E1007 19:27:47.796741 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:27:51 crc kubenswrapper[4825]: I1007 19:27:51.024297 4825 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-db-create-p8xwm"] Oct 07 19:27:51 crc kubenswrapper[4825]: I1007 19:27:51.034193 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f5sff"] Oct 07 19:27:51 crc kubenswrapper[4825]: I1007 19:27:51.046196 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-p8xwm"] Oct 07 19:27:51 crc kubenswrapper[4825]: I1007 19:27:51.054052 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f5sff"] Oct 07 19:27:51 crc kubenswrapper[4825]: I1007 19:27:51.814209 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2" path="/var/lib/kubelet/pods/7e2e5b54-c75b-4d58-aff1-ea98ac2f6dd2/volumes" Oct 07 19:27:51 crc kubenswrapper[4825]: I1007 19:27:51.814799 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab5d00f4-8395-4b1b-af14-687bbb9071c5" path="/var/lib/kubelet/pods/ab5d00f4-8395-4b1b-af14-687bbb9071c5/volumes" Oct 07 19:27:52 crc kubenswrapper[4825]: I1007 19:27:52.027201 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hrjbr"] Oct 07 19:27:52 crc kubenswrapper[4825]: I1007 19:27:52.036790 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hrjbr"] Oct 07 19:27:53 crc kubenswrapper[4825]: I1007 19:27:53.809480 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac31bb14-31ca-44cc-9e94-6af59e03a578" path="/var/lib/kubelet/pods/ac31bb14-31ca-44cc-9e94-6af59e03a578/volumes" Oct 07 19:27:58 crc kubenswrapper[4825]: I1007 19:27:58.795427 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:27:58 crc kubenswrapper[4825]: E1007 19:27:58.796658 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:28:04 crc kubenswrapper[4825]: I1007 19:28:04.050910 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4c09-account-create-bgzj4"] Oct 07 19:28:04 crc kubenswrapper[4825]: I1007 19:28:04.059780 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3398-account-create-xz4rk"] Oct 07 19:28:04 crc kubenswrapper[4825]: I1007 19:28:04.068158 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-327b-account-create-rmxxh"] Oct 07 19:28:04 crc kubenswrapper[4825]: I1007 19:28:04.075644 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3398-account-create-xz4rk"] Oct 07 19:28:04 crc kubenswrapper[4825]: I1007 19:28:04.082060 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4c09-account-create-bgzj4"] Oct 07 19:28:04 crc kubenswrapper[4825]: I1007 19:28:04.087964 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-327b-account-create-rmxxh"] Oct 07 19:28:05 crc kubenswrapper[4825]: I1007 19:28:05.829341 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f685da-b7bf-4718-b45c-7c19a681de56" path="/var/lib/kubelet/pods/13f685da-b7bf-4718-b45c-7c19a681de56/volumes" Oct 07 19:28:05 crc kubenswrapper[4825]: I1007 19:28:05.830669 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9661376c-9487-4ac1-af61-e4b4f846f554" path="/var/lib/kubelet/pods/9661376c-9487-4ac1-af61-e4b4f846f554/volumes" Oct 07 19:28:05 crc kubenswrapper[4825]: I1007 19:28:05.831697 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed9dc70-b8c0-434e-9ee3-68c176c29362" 
path="/var/lib/kubelet/pods/fed9dc70-b8c0-434e-9ee3-68c176c29362/volumes" Oct 07 19:28:07 crc kubenswrapper[4825]: I1007 19:28:07.030130 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rnvqv"] Oct 07 19:28:07 crc kubenswrapper[4825]: I1007 19:28:07.038683 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rnvqv"] Oct 07 19:28:07 crc kubenswrapper[4825]: I1007 19:28:07.810727 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c38775cb-8a0d-4834-8215-af35a9fd4952" path="/var/lib/kubelet/pods/c38775cb-8a0d-4834-8215-af35a9fd4952/volumes" Oct 07 19:28:12 crc kubenswrapper[4825]: I1007 19:28:12.044473 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-cw4gf"] Oct 07 19:28:12 crc kubenswrapper[4825]: I1007 19:28:12.060808 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-cw4gf"] Oct 07 19:28:12 crc kubenswrapper[4825]: I1007 19:28:12.995280 4825 generic.go:334] "Generic (PLEG): container finished" podID="35f68c5c-870d-448d-a680-decef3790f6b" containerID="ffe8ec11adf3621b46d063b78ef5f56ef20065a586920468e5adabb46f874971" exitCode=0 Oct 07 19:28:12 crc kubenswrapper[4825]: I1007 19:28:12.995440 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" event={"ID":"35f68c5c-870d-448d-a680-decef3790f6b","Type":"ContainerDied","Data":"ffe8ec11adf3621b46d063b78ef5f56ef20065a586920468e5adabb46f874971"} Oct 07 19:28:13 crc kubenswrapper[4825]: I1007 19:28:13.795175 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:28:13 crc kubenswrapper[4825]: E1007 19:28:13.795454 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:28:13 crc kubenswrapper[4825]: I1007 19:28:13.815414 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e981b526-0afb-4a9c-ba89-fe87728f4603" path="/var/lib/kubelet/pods/e981b526-0afb-4a9c-ba89-fe87728f4603/volumes" Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.509879 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.608788 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-inventory\") pod \"35f68c5c-870d-448d-a680-decef3790f6b\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.609271 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-ssh-key\") pod \"35f68c5c-870d-448d-a680-decef3790f6b\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.609404 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgj6n\" (UniqueName: \"kubernetes.io/projected/35f68c5c-870d-448d-a680-decef3790f6b-kube-api-access-vgj6n\") pod \"35f68c5c-870d-448d-a680-decef3790f6b\" (UID: \"35f68c5c-870d-448d-a680-decef3790f6b\") " Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.617461 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f68c5c-870d-448d-a680-decef3790f6b-kube-api-access-vgj6n" (OuterVolumeSpecName: 
"kube-api-access-vgj6n") pod "35f68c5c-870d-448d-a680-decef3790f6b" (UID: "35f68c5c-870d-448d-a680-decef3790f6b"). InnerVolumeSpecName "kube-api-access-vgj6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.638801 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-inventory" (OuterVolumeSpecName: "inventory") pod "35f68c5c-870d-448d-a680-decef3790f6b" (UID: "35f68c5c-870d-448d-a680-decef3790f6b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.641975 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35f68c5c-870d-448d-a680-decef3790f6b" (UID: "35f68c5c-870d-448d-a680-decef3790f6b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.712762 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgj6n\" (UniqueName: \"kubernetes.io/projected/35f68c5c-870d-448d-a680-decef3790f6b-kube-api-access-vgj6n\") on node \"crc\" DevicePath \"\"" Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.712847 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:28:14 crc kubenswrapper[4825]: I1007 19:28:14.712867 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f68c5c-870d-448d-a680-decef3790f6b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.028515 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" event={"ID":"35f68c5c-870d-448d-a680-decef3790f6b","Type":"ContainerDied","Data":"76980a64c75f581fde175972d803cd70fa1dadfd4bcb405484ea47c27072db99"} Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.030318 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76980a64c75f581fde175972d803cd70fa1dadfd4bcb405484ea47c27072db99" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.028864 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ngshx" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.136864 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl"] Oct 07 19:28:15 crc kubenswrapper[4825]: E1007 19:28:15.137799 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f68c5c-870d-448d-a680-decef3790f6b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.137851 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f68c5c-870d-448d-a680-decef3790f6b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.138431 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f68c5c-870d-448d-a680-decef3790f6b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.139775 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.144443 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.144780 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.144780 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.144944 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.149067 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl"] Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.222736 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2z89\" (UniqueName: \"kubernetes.io/projected/ad194ba9-9675-4a8e-be19-b44964a5b493-kube-api-access-s2z89\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.223054 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 
19:28:15.223372 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.325028 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.325110 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.325218 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2z89\" (UniqueName: \"kubernetes.io/projected/ad194ba9-9675-4a8e-be19-b44964a5b493-kube-api-access-s2z89\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.328747 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.332160 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.347629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2z89\" (UniqueName: \"kubernetes.io/projected/ad194ba9-9675-4a8e-be19-b44964a5b493-kube-api-access-s2z89\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:15 crc kubenswrapper[4825]: I1007 19:28:15.466470 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:28:16 crc kubenswrapper[4825]: I1007 19:28:16.028978 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl"] Oct 07 19:28:16 crc kubenswrapper[4825]: W1007 19:28:16.033835 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad194ba9_9675_4a8e_be19_b44964a5b493.slice/crio-c38cbf0607976167a0790ef464527444182a2b80554f9af65d44c9a8ede08902 WatchSource:0}: Error finding container c38cbf0607976167a0790ef464527444182a2b80554f9af65d44c9a8ede08902: Status 404 returned error can't find the container with id c38cbf0607976167a0790ef464527444182a2b80554f9af65d44c9a8ede08902 Oct 07 19:28:16 crc kubenswrapper[4825]: I1007 19:28:16.036333 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:28:17 crc kubenswrapper[4825]: I1007 19:28:17.050670 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" event={"ID":"ad194ba9-9675-4a8e-be19-b44964a5b493","Type":"ContainerStarted","Data":"a3543e2f776644a16b913190a9dccf5d01d06fd4d8d04f456fdf4f7ff5218fc6"} Oct 07 19:28:17 crc kubenswrapper[4825]: I1007 19:28:17.050986 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" event={"ID":"ad194ba9-9675-4a8e-be19-b44964a5b493","Type":"ContainerStarted","Data":"c38cbf0607976167a0790ef464527444182a2b80554f9af65d44c9a8ede08902"} Oct 07 19:28:17 crc kubenswrapper[4825]: I1007 19:28:17.073124 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" podStartSLOduration=1.895234415 podStartE2EDuration="2.073105895s" 
podCreationTimestamp="2025-10-07 19:28:15 +0000 UTC" firstStartedPulling="2025-10-07 19:28:16.036066026 +0000 UTC m=+1684.858104673" lastFinishedPulling="2025-10-07 19:28:16.213937506 +0000 UTC m=+1685.035976153" observedRunningTime="2025-10-07 19:28:17.067179417 +0000 UTC m=+1685.889218064" watchObservedRunningTime="2025-10-07 19:28:17.073105895 +0000 UTC m=+1685.895144552" Oct 07 19:28:24 crc kubenswrapper[4825]: I1007 19:28:24.795331 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:28:24 crc kubenswrapper[4825]: E1007 19:28:24.796123 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:28:38 crc kubenswrapper[4825]: I1007 19:28:38.796358 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:28:38 crc kubenswrapper[4825]: E1007 19:28:38.797347 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.069571 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-89skq"] Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.076216 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.080260 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89skq"] Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.237272 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-utilities\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.237354 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp6ws\" (UniqueName: \"kubernetes.io/projected/4173c521-a771-4afa-b5fb-c30e9260a9cc-kube-api-access-jp6ws\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.237505 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-catalog-content\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.355932 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-catalog-content\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.356168 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-utilities\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.356277 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp6ws\" (UniqueName: \"kubernetes.io/projected/4173c521-a771-4afa-b5fb-c30e9260a9cc-kube-api-access-jp6ws\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.357701 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-catalog-content\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.358158 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-utilities\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.383345 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp6ws\" (UniqueName: \"kubernetes.io/projected/4173c521-a771-4afa-b5fb-c30e9260a9cc-kube-api-access-jp6ws\") pod \"redhat-operators-89skq\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.413012 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:45 crc kubenswrapper[4825]: I1007 19:28:45.867205 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89skq"] Oct 07 19:28:46 crc kubenswrapper[4825]: E1007 19:28:46.214325 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4173c521_a771_4afa_b5fb_c30e9260a9cc.slice/crio-36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4173c521_a771_4afa_b5fb_c30e9260a9cc.slice/crio-conmon-36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df.scope\": RecentStats: unable to find data in memory cache]" Oct 07 19:28:46 crc kubenswrapper[4825]: I1007 19:28:46.381919 4825 generic.go:334] "Generic (PLEG): container finished" podID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerID="36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df" exitCode=0 Oct 07 19:28:46 crc kubenswrapper[4825]: I1007 19:28:46.381983 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89skq" event={"ID":"4173c521-a771-4afa-b5fb-c30e9260a9cc","Type":"ContainerDied","Data":"36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df"} Oct 07 19:28:46 crc kubenswrapper[4825]: I1007 19:28:46.382308 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89skq" event={"ID":"4173c521-a771-4afa-b5fb-c30e9260a9cc","Type":"ContainerStarted","Data":"ff88d3c4a599c7155175def596ee2d36cf29fab23755d9868558653d8e025bc3"} Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.391656 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89skq" 
event={"ID":"4173c521-a771-4afa-b5fb-c30e9260a9cc","Type":"ContainerStarted","Data":"df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904"} Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.495637 4825 scope.go:117] "RemoveContainer" containerID="2db6a6ec54cde59acb7506828b34a2ca3ab8872828ab4031ac5dfafa166331c6" Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.560641 4825 scope.go:117] "RemoveContainer" containerID="614bb105602c607a6a4de778d21c754de7d7b420a89dec2b5999fc9ad9338df3" Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.594314 4825 scope.go:117] "RemoveContainer" containerID="ca0169795172599287bb6168937113feecdb5f27e5899e43d0cab81f82c5f322" Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.641318 4825 scope.go:117] "RemoveContainer" containerID="9ec338039e8e81ace96a3c1e412c1c5618237af373575f561bc1f48885dcfd88" Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.684978 4825 scope.go:117] "RemoveContainer" containerID="4186850b8f8a80925779fceac05d91bf622138245e945fddc1a9f23e64e45757" Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.723174 4825 scope.go:117] "RemoveContainer" containerID="b3e3f43b9a08835f74a4535947ef7941a4bf08570c4c15e8f2014da54c998f04" Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.741502 4825 scope.go:117] "RemoveContainer" containerID="d3c07cff373a2370fbfc0ceac515737fda186b40a16e7c6e0a6be4db38f31f2c" Oct 07 19:28:47 crc kubenswrapper[4825]: I1007 19:28:47.763902 4825 scope.go:117] "RemoveContainer" containerID="b4a7f889fdf763b8db72d95639951da308ba33e2bdd9e991b334942643daadb4" Oct 07 19:28:48 crc kubenswrapper[4825]: I1007 19:28:48.408463 4825 generic.go:334] "Generic (PLEG): container finished" podID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerID="df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904" exitCode=0 Oct 07 19:28:48 crc kubenswrapper[4825]: I1007 19:28:48.408656 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-89skq" event={"ID":"4173c521-a771-4afa-b5fb-c30e9260a9cc","Type":"ContainerDied","Data":"df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904"} Oct 07 19:28:49 crc kubenswrapper[4825]: I1007 19:28:49.083047 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cccsz"] Oct 07 19:28:49 crc kubenswrapper[4825]: I1007 19:28:49.096769 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cccsz"] Oct 07 19:28:49 crc kubenswrapper[4825]: I1007 19:28:49.423039 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89skq" event={"ID":"4173c521-a771-4afa-b5fb-c30e9260a9cc","Type":"ContainerStarted","Data":"acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153"} Oct 07 19:28:49 crc kubenswrapper[4825]: I1007 19:28:49.453567 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-89skq" podStartSLOduration=1.732969736 podStartE2EDuration="4.453551155s" podCreationTimestamp="2025-10-07 19:28:45 +0000 UTC" firstStartedPulling="2025-10-07 19:28:46.384838538 +0000 UTC m=+1715.206877205" lastFinishedPulling="2025-10-07 19:28:49.105419967 +0000 UTC m=+1717.927458624" observedRunningTime="2025-10-07 19:28:49.442586725 +0000 UTC m=+1718.264625362" watchObservedRunningTime="2025-10-07 19:28:49.453551155 +0000 UTC m=+1718.275589792" Oct 07 19:28:49 crc kubenswrapper[4825]: I1007 19:28:49.806408 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019abaa0-c821-4f8c-a195-a9ea7bc81f8b" path="/var/lib/kubelet/pods/019abaa0-c821-4f8c-a195-a9ea7bc81f8b/volumes" Oct 07 19:28:53 crc kubenswrapper[4825]: I1007 19:28:53.795278 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:28:53 crc kubenswrapper[4825]: E1007 19:28:53.796137 4825 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.039363 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q29ng"] Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.047048 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tnkcb"] Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.054034 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8hldz"] Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.062434 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q29ng"] Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.068538 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8hldz"] Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.075887 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tnkcb"] Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.413798 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.413871 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.496621 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.557011 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.752724 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89skq"] Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.808280 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abc8e94-8f1f-4195-b476-248206d004bf" path="/var/lib/kubelet/pods/1abc8e94-8f1f-4195-b476-248206d004bf/volumes" Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.809044 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9e561b-95fb-4643-8452-01f9ae3475eb" path="/var/lib/kubelet/pods/2d9e561b-95fb-4643-8452-01f9ae3475eb/volumes" Oct 07 19:28:55 crc kubenswrapper[4825]: I1007 19:28:55.810106 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dbca3ac-3960-4572-93c4-04276137f96a" path="/var/lib/kubelet/pods/3dbca3ac-3960-4572-93c4-04276137f96a/volumes" Oct 07 19:28:57 crc kubenswrapper[4825]: I1007 19:28:57.523779 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-89skq" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="registry-server" containerID="cri-o://acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153" gracePeriod=2 Oct 07 19:28:57 crc kubenswrapper[4825]: I1007 19:28:57.970916 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.073511 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-utilities\") pod \"4173c521-a771-4afa-b5fb-c30e9260a9cc\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.073600 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-catalog-content\") pod \"4173c521-a771-4afa-b5fb-c30e9260a9cc\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.073879 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp6ws\" (UniqueName: \"kubernetes.io/projected/4173c521-a771-4afa-b5fb-c30e9260a9cc-kube-api-access-jp6ws\") pod \"4173c521-a771-4afa-b5fb-c30e9260a9cc\" (UID: \"4173c521-a771-4afa-b5fb-c30e9260a9cc\") " Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.074883 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-utilities" (OuterVolumeSpecName: "utilities") pod "4173c521-a771-4afa-b5fb-c30e9260a9cc" (UID: "4173c521-a771-4afa-b5fb-c30e9260a9cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.083007 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4173c521-a771-4afa-b5fb-c30e9260a9cc-kube-api-access-jp6ws" (OuterVolumeSpecName: "kube-api-access-jp6ws") pod "4173c521-a771-4afa-b5fb-c30e9260a9cc" (UID: "4173c521-a771-4afa-b5fb-c30e9260a9cc"). InnerVolumeSpecName "kube-api-access-jp6ws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.176743 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp6ws\" (UniqueName: \"kubernetes.io/projected/4173c521-a771-4afa-b5fb-c30e9260a9cc-kube-api-access-jp6ws\") on node \"crc\" DevicePath \"\"" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.176847 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.542161 4825 generic.go:334] "Generic (PLEG): container finished" podID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerID="acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153" exitCode=0 Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.542223 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89skq" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.542265 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89skq" event={"ID":"4173c521-a771-4afa-b5fb-c30e9260a9cc","Type":"ContainerDied","Data":"acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153"} Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.542306 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89skq" event={"ID":"4173c521-a771-4afa-b5fb-c30e9260a9cc","Type":"ContainerDied","Data":"ff88d3c4a599c7155175def596ee2d36cf29fab23755d9868558653d8e025bc3"} Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.542334 4825 scope.go:117] "RemoveContainer" containerID="acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.579805 4825 scope.go:117] "RemoveContainer" 
containerID="df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.601023 4825 scope.go:117] "RemoveContainer" containerID="36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.656178 4825 scope.go:117] "RemoveContainer" containerID="acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153" Oct 07 19:28:58 crc kubenswrapper[4825]: E1007 19:28:58.656732 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153\": container with ID starting with acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153 not found: ID does not exist" containerID="acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.656777 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153"} err="failed to get container status \"acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153\": rpc error: code = NotFound desc = could not find container \"acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153\": container with ID starting with acfcb4903a899dd39ed1a2f5a8ce25e3e4b62f43933a1cc9255064de94506153 not found: ID does not exist" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.656803 4825 scope.go:117] "RemoveContainer" containerID="df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904" Oct 07 19:28:58 crc kubenswrapper[4825]: E1007 19:28:58.657740 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904\": container with ID starting with 
df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904 not found: ID does not exist" containerID="df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.657794 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904"} err="failed to get container status \"df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904\": rpc error: code = NotFound desc = could not find container \"df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904\": container with ID starting with df73962c64a82cd1299cd033cf4e43f35e7da60d05a8bf074da7e1e9e1bc1904 not found: ID does not exist" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.657812 4825 scope.go:117] "RemoveContainer" containerID="36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df" Oct 07 19:28:58 crc kubenswrapper[4825]: E1007 19:28:58.658343 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df\": container with ID starting with 36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df not found: ID does not exist" containerID="36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df" Oct 07 19:28:58 crc kubenswrapper[4825]: I1007 19:28:58.658586 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df"} err="failed to get container status \"36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df\": rpc error: code = NotFound desc = could not find container \"36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df\": container with ID starting with 36a2166ca35932a6c3c95067535759646f52fe652721ca6376f973aa30f172df not found: ID does not 
exist" Oct 07 19:28:59 crc kubenswrapper[4825]: I1007 19:28:59.444039 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4173c521-a771-4afa-b5fb-c30e9260a9cc" (UID: "4173c521-a771-4afa-b5fb-c30e9260a9cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:28:59 crc kubenswrapper[4825]: I1007 19:28:59.503878 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4173c521-a771-4afa-b5fb-c30e9260a9cc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:28:59 crc kubenswrapper[4825]: I1007 19:28:59.790443 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89skq"] Oct 07 19:28:59 crc kubenswrapper[4825]: I1007 19:28:59.814292 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-89skq"] Oct 07 19:29:01 crc kubenswrapper[4825]: I1007 19:29:01.813744 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" path="/var/lib/kubelet/pods/4173c521-a771-4afa-b5fb-c30e9260a9cc/volumes" Oct 07 19:29:05 crc kubenswrapper[4825]: I1007 19:29:05.798306 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:29:05 crc kubenswrapper[4825]: E1007 19:29:05.799252 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:29:09 
crc kubenswrapper[4825]: I1007 19:29:09.061133 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vlkds"] Oct 07 19:29:09 crc kubenswrapper[4825]: I1007 19:29:09.069010 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vlkds"] Oct 07 19:29:09 crc kubenswrapper[4825]: I1007 19:29:09.810892 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba465067-0e79-4d52-bc56-a4b60767eb7d" path="/var/lib/kubelet/pods/ba465067-0e79-4d52-bc56-a4b60767eb7d/volumes" Oct 07 19:29:17 crc kubenswrapper[4825]: I1007 19:29:17.795288 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:29:17 crc kubenswrapper[4825]: E1007 19:29:17.796204 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:29:20 crc kubenswrapper[4825]: I1007 19:29:20.814506 4825 generic.go:334] "Generic (PLEG): container finished" podID="ad194ba9-9675-4a8e-be19-b44964a5b493" containerID="a3543e2f776644a16b913190a9dccf5d01d06fd4d8d04f456fdf4f7ff5218fc6" exitCode=0 Oct 07 19:29:20 crc kubenswrapper[4825]: I1007 19:29:20.814811 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" event={"ID":"ad194ba9-9675-4a8e-be19-b44964a5b493","Type":"ContainerDied","Data":"a3543e2f776644a16b913190a9dccf5d01d06fd4d8d04f456fdf4f7ff5218fc6"} Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.295548 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.491345 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-ssh-key\") pod \"ad194ba9-9675-4a8e-be19-b44964a5b493\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.491424 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2z89\" (UniqueName: \"kubernetes.io/projected/ad194ba9-9675-4a8e-be19-b44964a5b493-kube-api-access-s2z89\") pod \"ad194ba9-9675-4a8e-be19-b44964a5b493\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.491541 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-inventory\") pod \"ad194ba9-9675-4a8e-be19-b44964a5b493\" (UID: \"ad194ba9-9675-4a8e-be19-b44964a5b493\") " Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.499000 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad194ba9-9675-4a8e-be19-b44964a5b493-kube-api-access-s2z89" (OuterVolumeSpecName: "kube-api-access-s2z89") pod "ad194ba9-9675-4a8e-be19-b44964a5b493" (UID: "ad194ba9-9675-4a8e-be19-b44964a5b493"). InnerVolumeSpecName "kube-api-access-s2z89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.518170 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-inventory" (OuterVolumeSpecName: "inventory") pod "ad194ba9-9675-4a8e-be19-b44964a5b493" (UID: "ad194ba9-9675-4a8e-be19-b44964a5b493"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.526135 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad194ba9-9675-4a8e-be19-b44964a5b493" (UID: "ad194ba9-9675-4a8e-be19-b44964a5b493"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.594874 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.594941 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2z89\" (UniqueName: \"kubernetes.io/projected/ad194ba9-9675-4a8e-be19-b44964a5b493-kube-api-access-s2z89\") on node \"crc\" DevicePath \"\"" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.594972 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad194ba9-9675-4a8e-be19-b44964a5b493-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.840809 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" event={"ID":"ad194ba9-9675-4a8e-be19-b44964a5b493","Type":"ContainerDied","Data":"c38cbf0607976167a0790ef464527444182a2b80554f9af65d44c9a8ede08902"} Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.840863 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c38cbf0607976167a0790ef464527444182a2b80554f9af65d44c9a8ede08902" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.840874 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.945286 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx"] Oct 07 19:29:22 crc kubenswrapper[4825]: E1007 19:29:22.945625 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="registry-server" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.945641 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="registry-server" Oct 07 19:29:22 crc kubenswrapper[4825]: E1007 19:29:22.945665 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="extract-utilities" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.945671 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="extract-utilities" Oct 07 19:29:22 crc kubenswrapper[4825]: E1007 19:29:22.945687 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="extract-content" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.945693 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="extract-content" Oct 07 19:29:22 crc kubenswrapper[4825]: E1007 19:29:22.945707 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad194ba9-9675-4a8e-be19-b44964a5b493" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.945713 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad194ba9-9675-4a8e-be19-b44964a5b493" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 19:29:22 crc 
kubenswrapper[4825]: I1007 19:29:22.945901 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad194ba9-9675-4a8e-be19-b44964a5b493" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.945927 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4173c521-a771-4afa-b5fb-c30e9260a9cc" containerName="registry-server" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.946548 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.949278 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.949754 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.949950 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.950225 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:29:22 crc kubenswrapper[4825]: I1007 19:29:22.960370 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx"] Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.105538 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.105948 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztcs6\" (UniqueName: \"kubernetes.io/projected/b0c642ee-a887-496b-a212-48601b94af99-kube-api-access-ztcs6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.106246 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.208812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.208920 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztcs6\" (UniqueName: \"kubernetes.io/projected/b0c642ee-a887-496b-a212-48601b94af99-kube-api-access-ztcs6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.209045 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.213583 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.223086 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.228317 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztcs6\" (UniqueName: \"kubernetes.io/projected/b0c642ee-a887-496b-a212-48601b94af99-kube-api-access-ztcs6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.267087 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:23 crc kubenswrapper[4825]: I1007 19:29:23.845058 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx"] Oct 07 19:29:24 crc kubenswrapper[4825]: I1007 19:29:24.874258 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" event={"ID":"b0c642ee-a887-496b-a212-48601b94af99","Type":"ContainerStarted","Data":"662e9b6534a2790d92aff2dca2bb746d191462984b0acd0e7830d26f804ac5ad"} Oct 07 19:29:24 crc kubenswrapper[4825]: I1007 19:29:24.874329 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" event={"ID":"b0c642ee-a887-496b-a212-48601b94af99","Type":"ContainerStarted","Data":"38151d69872fcab380cc53523e29663708836a06a61a163a09f84bffd7ffedb1"} Oct 07 19:29:28 crc kubenswrapper[4825]: I1007 19:29:28.926558 4825 generic.go:334] "Generic (PLEG): container finished" podID="b0c642ee-a887-496b-a212-48601b94af99" containerID="662e9b6534a2790d92aff2dca2bb746d191462984b0acd0e7830d26f804ac5ad" exitCode=0 Oct 07 19:29:28 crc kubenswrapper[4825]: I1007 19:29:28.926758 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" event={"ID":"b0c642ee-a887-496b-a212-48601b94af99","Type":"ContainerDied","Data":"662e9b6534a2790d92aff2dca2bb746d191462984b0acd0e7830d26f804ac5ad"} Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.413459 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.470021 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztcs6\" (UniqueName: \"kubernetes.io/projected/b0c642ee-a887-496b-a212-48601b94af99-kube-api-access-ztcs6\") pod \"b0c642ee-a887-496b-a212-48601b94af99\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.470140 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-inventory\") pod \"b0c642ee-a887-496b-a212-48601b94af99\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.470258 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-ssh-key\") pod \"b0c642ee-a887-496b-a212-48601b94af99\" (UID: \"b0c642ee-a887-496b-a212-48601b94af99\") " Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.478898 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c642ee-a887-496b-a212-48601b94af99-kube-api-access-ztcs6" (OuterVolumeSpecName: "kube-api-access-ztcs6") pod "b0c642ee-a887-496b-a212-48601b94af99" (UID: "b0c642ee-a887-496b-a212-48601b94af99"). InnerVolumeSpecName "kube-api-access-ztcs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.510808 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-inventory" (OuterVolumeSpecName: "inventory") pod "b0c642ee-a887-496b-a212-48601b94af99" (UID: "b0c642ee-a887-496b-a212-48601b94af99"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.530186 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0c642ee-a887-496b-a212-48601b94af99" (UID: "b0c642ee-a887-496b-a212-48601b94af99"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.571702 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.571740 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c642ee-a887-496b-a212-48601b94af99-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.571756 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztcs6\" (UniqueName: \"kubernetes.io/projected/b0c642ee-a887-496b-a212-48601b94af99-kube-api-access-ztcs6\") on node \"crc\" DevicePath \"\"" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.795449 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:29:30 crc kubenswrapper[4825]: E1007 19:29:30.795877 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.961680 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" event={"ID":"b0c642ee-a887-496b-a212-48601b94af99","Type":"ContainerDied","Data":"38151d69872fcab380cc53523e29663708836a06a61a163a09f84bffd7ffedb1"} Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.961751 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38151d69872fcab380cc53523e29663708836a06a61a163a09f84bffd7ffedb1" Oct 07 19:29:30 crc kubenswrapper[4825]: I1007 19:29:30.961850 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.050370 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn"] Oct 07 19:29:31 crc kubenswrapper[4825]: E1007 19:29:31.051149 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c642ee-a887-496b-a212-48601b94af99" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.051182 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c642ee-a887-496b-a212-48601b94af99" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.051565 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c642ee-a887-496b-a212-48601b94af99" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.052540 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.055072 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.055368 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.056059 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.056243 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.064142 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn"] Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.083080 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.083149 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.083184 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdf2t\" (UniqueName: \"kubernetes.io/projected/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-kube-api-access-wdf2t\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.184552 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.184614 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.184652 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdf2t\" (UniqueName: \"kubernetes.io/projected/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-kube-api-access-wdf2t\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.191654 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: 
\"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.191895 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.205110 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdf2t\" (UniqueName: \"kubernetes.io/projected/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-kube-api-access-wdf2t\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tb8kn\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.382929 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:29:31 crc kubenswrapper[4825]: I1007 19:29:31.986614 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn"] Oct 07 19:29:31 crc kubenswrapper[4825]: W1007 19:29:31.990298 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2e86406_64eb_4c0c_8f9d_38b2a64ddc48.slice/crio-6dc92b50c83de7902289464433fe1af50e5973691a1947215536e5767bffe457 WatchSource:0}: Error finding container 6dc92b50c83de7902289464433fe1af50e5973691a1947215536e5767bffe457: Status 404 returned error can't find the container with id 6dc92b50c83de7902289464433fe1af50e5973691a1947215536e5767bffe457 Oct 07 19:29:32 crc kubenswrapper[4825]: I1007 19:29:32.985292 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" event={"ID":"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48","Type":"ContainerStarted","Data":"a08514e2701d7f5f98b6bc085e3fde0128e4c1aa12117e68ab45070e6521db0e"} Oct 07 19:29:32 crc kubenswrapper[4825]: I1007 19:29:32.985689 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" event={"ID":"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48","Type":"ContainerStarted","Data":"6dc92b50c83de7902289464433fe1af50e5973691a1947215536e5767bffe457"} Oct 07 19:29:33 crc kubenswrapper[4825]: I1007 19:29:33.025541 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" podStartSLOduration=1.85692733 podStartE2EDuration="2.025508757s" podCreationTimestamp="2025-10-07 19:29:31 +0000 UTC" firstStartedPulling="2025-10-07 19:29:31.993525811 +0000 UTC m=+1760.815564458" lastFinishedPulling="2025-10-07 19:29:32.162107218 +0000 UTC m=+1760.984145885" 
observedRunningTime="2025-10-07 19:29:33.010996708 +0000 UTC m=+1761.833035385" watchObservedRunningTime="2025-10-07 19:29:33.025508757 +0000 UTC m=+1761.847547424" Oct 07 19:29:45 crc kubenswrapper[4825]: I1007 19:29:45.795696 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:29:45 crc kubenswrapper[4825]: E1007 19:29:45.797141 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:29:47 crc kubenswrapper[4825]: I1007 19:29:47.969610 4825 scope.go:117] "RemoveContainer" containerID="e2453075a44cf7eb404e3853642aeab5fe6820e2fbc4aed966e4b20a5ca29edd" Oct 07 19:29:48 crc kubenswrapper[4825]: I1007 19:29:48.025803 4825 scope.go:117] "RemoveContainer" containerID="f8e9a2c8b3a4847f51327cedb19b1b2435b747f56dfcd88d3b2654273e64326d" Oct 07 19:29:48 crc kubenswrapper[4825]: I1007 19:29:48.079128 4825 scope.go:117] "RemoveContainer" containerID="5468c16d9290fd9e1ab84b25c93a6102553ae19cdd636995b2fe7c94302ed5b1" Oct 07 19:29:48 crc kubenswrapper[4825]: I1007 19:29:48.168527 4825 scope.go:117] "RemoveContainer" containerID="9303478f08a4b2c0ee54d55314c59e8480daf1fe22eea067e520f7bdce9d2beb" Oct 07 19:29:48 crc kubenswrapper[4825]: I1007 19:29:48.198965 4825 scope.go:117] "RemoveContainer" containerID="b3983c1456971ed1e8946e43305af7caa014da109e1daed3fd8bc3d89db94d6b" Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.043667 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-rqhll"] Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.052834 4825 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell1-db-create-rqhll"] Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.061697 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2lwlm"] Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.070169 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vnpfn"] Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.076498 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vnpfn"] Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.082474 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2lwlm"] Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.811829 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722d9cc6-daa7-4ca6-b795-93734a5d3c3c" path="/var/lib/kubelet/pods/722d9cc6-daa7-4ca6-b795-93734a5d3c3c/volumes" Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.812882 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9711a494-299c-48ba-9ec3-6cc8e2ede8f3" path="/var/lib/kubelet/pods/9711a494-299c-48ba-9ec3-6cc8e2ede8f3/volumes" Oct 07 19:29:51 crc kubenswrapper[4825]: I1007 19:29:51.813984 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8a68bd-3d55-4267-b003-773c5444996f" path="/var/lib/kubelet/pods/fd8a68bd-3d55-4267-b003-773c5444996f/volumes" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.050111 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e266-account-create-crc5b"] Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.061705 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e266-account-create-crc5b"] Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.157118 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp"] Oct 07 19:30:00 
crc kubenswrapper[4825]: I1007 19:30:00.158656 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.162114 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.166702 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.171553 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp"] Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.238449 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-config-volume\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.238806 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-secret-volume\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.238967 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzvbd\" (UniqueName: \"kubernetes.io/projected/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-kube-api-access-hzvbd\") pod \"collect-profiles-29331090-7shxp\" 
(UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.340407 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-config-volume\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.340516 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-secret-volume\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.340566 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzvbd\" (UniqueName: \"kubernetes.io/projected/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-kube-api-access-hzvbd\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.341373 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-config-volume\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.347655 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-secret-volume\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.357655 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzvbd\" (UniqueName: \"kubernetes.io/projected/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-kube-api-access-hzvbd\") pod \"collect-profiles-29331090-7shxp\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.484005 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.795428 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:30:00 crc kubenswrapper[4825]: E1007 19:30:00.796848 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:30:00 crc kubenswrapper[4825]: I1007 19:30:00.979216 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp"] Oct 07 19:30:00 crc kubenswrapper[4825]: W1007 19:30:00.985128 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88bb94c8_6b57_4a05_9c6e_5c169fb26bb6.slice/crio-076c2ad4045a5632afaba35c7ee62c22cadc01be9b3fbbaff81300462a8f4e5a WatchSource:0}: Error finding container 076c2ad4045a5632afaba35c7ee62c22cadc01be9b3fbbaff81300462a8f4e5a: Status 404 returned error can't find the container with id 076c2ad4045a5632afaba35c7ee62c22cadc01be9b3fbbaff81300462a8f4e5a Oct 07 19:30:01 crc kubenswrapper[4825]: I1007 19:30:01.281070 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" event={"ID":"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6","Type":"ContainerStarted","Data":"eb4815e3e3379df085a287d82a218e1c4243ec08487ed754b4736cb8ebf47665"} Oct 07 19:30:01 crc kubenswrapper[4825]: I1007 19:30:01.281114 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" event={"ID":"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6","Type":"ContainerStarted","Data":"076c2ad4045a5632afaba35c7ee62c22cadc01be9b3fbbaff81300462a8f4e5a"} Oct 07 19:30:01 crc kubenswrapper[4825]: I1007 19:30:01.305132 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" podStartSLOduration=1.305111341 podStartE2EDuration="1.305111341s" podCreationTimestamp="2025-10-07 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 19:30:01.298850139 +0000 UTC m=+1790.120888796" watchObservedRunningTime="2025-10-07 19:30:01.305111341 +0000 UTC m=+1790.127149978" Oct 07 19:30:01 crc kubenswrapper[4825]: I1007 19:30:01.809270 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6310f91-ae90-4fd6-a4a5-51a43304420a" path="/var/lib/kubelet/pods/d6310f91-ae90-4fd6-a4a5-51a43304420a/volumes" Oct 07 19:30:02 crc kubenswrapper[4825]: I1007 
19:30:02.042865 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f3e1-account-create-psn2x"] Oct 07 19:30:02 crc kubenswrapper[4825]: I1007 19:30:02.052927 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d2e4-account-create-6vqxn"] Oct 07 19:30:02 crc kubenswrapper[4825]: I1007 19:30:02.061604 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f3e1-account-create-psn2x"] Oct 07 19:30:02 crc kubenswrapper[4825]: I1007 19:30:02.070200 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d2e4-account-create-6vqxn"] Oct 07 19:30:02 crc kubenswrapper[4825]: I1007 19:30:02.293812 4825 generic.go:334] "Generic (PLEG): container finished" podID="88bb94c8-6b57-4a05-9c6e-5c169fb26bb6" containerID="eb4815e3e3379df085a287d82a218e1c4243ec08487ed754b4736cb8ebf47665" exitCode=0 Oct 07 19:30:02 crc kubenswrapper[4825]: I1007 19:30:02.293852 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" event={"ID":"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6","Type":"ContainerDied","Data":"eb4815e3e3379df085a287d82a218e1c4243ec08487ed754b4736cb8ebf47665"} Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.681690 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.709609 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-config-volume\") pod \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.709797 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-secret-volume\") pod \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.709874 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzvbd\" (UniqueName: \"kubernetes.io/projected/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-kube-api-access-hzvbd\") pod \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\" (UID: \"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6\") " Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.710688 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-config-volume" (OuterVolumeSpecName: "config-volume") pod "88bb94c8-6b57-4a05-9c6e-5c169fb26bb6" (UID: "88bb94c8-6b57-4a05-9c6e-5c169fb26bb6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.722504 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "88bb94c8-6b57-4a05-9c6e-5c169fb26bb6" (UID: "88bb94c8-6b57-4a05-9c6e-5c169fb26bb6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.730146 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-kube-api-access-hzvbd" (OuterVolumeSpecName: "kube-api-access-hzvbd") pod "88bb94c8-6b57-4a05-9c6e-5c169fb26bb6" (UID: "88bb94c8-6b57-4a05-9c6e-5c169fb26bb6"). InnerVolumeSpecName "kube-api-access-hzvbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.812556 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50aee8af-3586-4ace-9c5a-4046a5af52e1" path="/var/lib/kubelet/pods/50aee8af-3586-4ace-9c5a-4046a5af52e1/volumes" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.813709 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.813743 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.813763 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzvbd\" (UniqueName: \"kubernetes.io/projected/88bb94c8-6b57-4a05-9c6e-5c169fb26bb6-kube-api-access-hzvbd\") on node \"crc\" DevicePath \"\"" Oct 07 19:30:03 crc kubenswrapper[4825]: I1007 19:30:03.816153 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="676ed7ae-5a2c-4680-9138-81a1cf620b00" path="/var/lib/kubelet/pods/676ed7ae-5a2c-4680-9138-81a1cf620b00/volumes" Oct 07 19:30:04 crc kubenswrapper[4825]: I1007 19:30:04.318566 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" event={"ID":"88bb94c8-6b57-4a05-9c6e-5c169fb26bb6","Type":"ContainerDied","Data":"076c2ad4045a5632afaba35c7ee62c22cadc01be9b3fbbaff81300462a8f4e5a"} Oct 07 19:30:04 crc kubenswrapper[4825]: I1007 19:30:04.318610 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="076c2ad4045a5632afaba35c7ee62c22cadc01be9b3fbbaff81300462a8f4e5a" Oct 07 19:30:04 crc kubenswrapper[4825]: I1007 19:30:04.318662 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331090-7shxp" Oct 07 19:30:08 crc kubenswrapper[4825]: I1007 19:30:08.354367 4825 generic.go:334] "Generic (PLEG): container finished" podID="c2e86406-64eb-4c0c-8f9d-38b2a64ddc48" containerID="a08514e2701d7f5f98b6bc085e3fde0128e4c1aa12117e68ab45070e6521db0e" exitCode=0 Oct 07 19:30:08 crc kubenswrapper[4825]: I1007 19:30:08.354456 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" event={"ID":"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48","Type":"ContainerDied","Data":"a08514e2701d7f5f98b6bc085e3fde0128e4c1aa12117e68ab45070e6521db0e"} Oct 07 19:30:09 crc kubenswrapper[4825]: I1007 19:30:09.859583 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.031069 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdf2t\" (UniqueName: \"kubernetes.io/projected/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-kube-api-access-wdf2t\") pod \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.031196 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-ssh-key\") pod \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.031252 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-inventory\") pod \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\" (UID: \"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48\") " Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.038346 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-kube-api-access-wdf2t" (OuterVolumeSpecName: "kube-api-access-wdf2t") pod "c2e86406-64eb-4c0c-8f9d-38b2a64ddc48" (UID: "c2e86406-64eb-4c0c-8f9d-38b2a64ddc48"). InnerVolumeSpecName "kube-api-access-wdf2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.080479 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-inventory" (OuterVolumeSpecName: "inventory") pod "c2e86406-64eb-4c0c-8f9d-38b2a64ddc48" (UID: "c2e86406-64eb-4c0c-8f9d-38b2a64ddc48"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.091380 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c2e86406-64eb-4c0c-8f9d-38b2a64ddc48" (UID: "c2e86406-64eb-4c0c-8f9d-38b2a64ddc48"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.134020 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.134056 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdf2t\" (UniqueName: \"kubernetes.io/projected/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-kube-api-access-wdf2t\") on node \"crc\" DevicePath \"\"" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.134070 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2e86406-64eb-4c0c-8f9d-38b2a64ddc48-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.374360 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" event={"ID":"c2e86406-64eb-4c0c-8f9d-38b2a64ddc48","Type":"ContainerDied","Data":"6dc92b50c83de7902289464433fe1af50e5973691a1947215536e5767bffe457"} Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.374430 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc92b50c83de7902289464433fe1af50e5973691a1947215536e5767bffe457" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.374461 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tb8kn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.478644 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn"] Oct 07 19:30:10 crc kubenswrapper[4825]: E1007 19:30:10.479411 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88bb94c8-6b57-4a05-9c6e-5c169fb26bb6" containerName="collect-profiles" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.479502 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="88bb94c8-6b57-4a05-9c6e-5c169fb26bb6" containerName="collect-profiles" Oct 07 19:30:10 crc kubenswrapper[4825]: E1007 19:30:10.479583 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e86406-64eb-4c0c-8f9d-38b2a64ddc48" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.479648 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e86406-64eb-4c0c-8f9d-38b2a64ddc48" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.479932 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="88bb94c8-6b57-4a05-9c6e-5c169fb26bb6" containerName="collect-profiles" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.480012 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e86406-64eb-4c0c-8f9d-38b2a64ddc48" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.480830 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.483947 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.484040 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.484574 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.485363 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.502351 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn"] Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.643703 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.643959 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.644148 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8c6\" (UniqueName: \"kubernetes.io/projected/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-kube-api-access-4k8c6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.745422 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8c6\" (UniqueName: \"kubernetes.io/projected/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-kube-api-access-4k8c6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.745498 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.745531 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.750838 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: 
\"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.758279 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.772787 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8c6\" (UniqueName: \"kubernetes.io/projected/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-kube-api-access-4k8c6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:10 crc kubenswrapper[4825]: I1007 19:30:10.810853 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:30:11 crc kubenswrapper[4825]: W1007 19:30:11.364417 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7f120f_3b67_4abe_a9b9_c2d2b5ce6b0c.slice/crio-729e83ef62d3b3f4e4e7c24625544dfc355fde26bccc468fb89346d98685af9a WatchSource:0}: Error finding container 729e83ef62d3b3f4e4e7c24625544dfc355fde26bccc468fb89346d98685af9a: Status 404 returned error can't find the container with id 729e83ef62d3b3f4e4e7c24625544dfc355fde26bccc468fb89346d98685af9a Oct 07 19:30:11 crc kubenswrapper[4825]: I1007 19:30:11.368812 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn"] Oct 07 19:30:11 crc kubenswrapper[4825]: I1007 19:30:11.388035 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" event={"ID":"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c","Type":"ContainerStarted","Data":"729e83ef62d3b3f4e4e7c24625544dfc355fde26bccc468fb89346d98685af9a"} Oct 07 19:30:12 crc kubenswrapper[4825]: I1007 19:30:12.399954 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" event={"ID":"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c","Type":"ContainerStarted","Data":"35dff83b305bdc83ea09083fc04441f81fed2b79a30d8531b025bbc8857a97bc"} Oct 07 19:30:12 crc kubenswrapper[4825]: I1007 19:30:12.416500 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" podStartSLOduration=2.277783962 podStartE2EDuration="2.416478491s" podCreationTimestamp="2025-10-07 19:30:10 +0000 UTC" firstStartedPulling="2025-10-07 19:30:11.370317226 +0000 UTC m=+1800.192355873" lastFinishedPulling="2025-10-07 19:30:11.509011765 +0000 UTC m=+1800.331050402" 
observedRunningTime="2025-10-07 19:30:12.415998236 +0000 UTC m=+1801.238036873" watchObservedRunningTime="2025-10-07 19:30:12.416478491 +0000 UTC m=+1801.238517128" Oct 07 19:30:14 crc kubenswrapper[4825]: I1007 19:30:14.796003 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:30:14 crc kubenswrapper[4825]: E1007 19:30:14.796622 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:30:25 crc kubenswrapper[4825]: I1007 19:30:25.795087 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:30:25 crc kubenswrapper[4825]: E1007 19:30:25.796007 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:30:26 crc kubenswrapper[4825]: I1007 19:30:26.033478 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cxfw9"] Oct 07 19:30:26 crc kubenswrapper[4825]: I1007 19:30:26.042047 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cxfw9"] Oct 07 19:30:27 crc kubenswrapper[4825]: I1007 19:30:27.813743 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88" path="/var/lib/kubelet/pods/b2ed8fa7-e58f-40e9-ab2a-c6fa11c57d88/volumes" Oct 07 19:30:40 crc kubenswrapper[4825]: I1007 19:30:40.795996 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:30:40 crc kubenswrapper[4825]: E1007 19:30:40.796791 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.043765 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4nvlv"] Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.054470 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4nvlv"] Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.372269 4825 scope.go:117] "RemoveContainer" containerID="e02641bc6aef60408675db5ace8a39d6fb928a411b49e0d00102258fa8b5d35a" Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.443383 4825 scope.go:117] "RemoveContainer" containerID="48c4c70bacf7a002e12f9e617d56bd1e5679a77d86655fa4831c360b8b9dee78" Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.485820 4825 scope.go:117] "RemoveContainer" containerID="482db4f395495ed28699f29794a9f3ab54417a6cac5a29d96e47fed53f20bc66" Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.547970 4825 scope.go:117] "RemoveContainer" containerID="ae50acada5639c5d54ea3a51c1c5c4d876d000e9d74ecd86edfdf24796c4429d" Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.581901 4825 scope.go:117] "RemoveContainer" 
containerID="10854187a6a87534d43ea89325e40bfdddb3e63a16f27ce9866afcdbaed1237d" Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.635289 4825 scope.go:117] "RemoveContainer" containerID="2fdc77b1ad585ca5f7b1429c1cc480627b46203aa0e26b61e4eada4faaef1da7" Oct 07 19:30:48 crc kubenswrapper[4825]: I1007 19:30:48.677588 4825 scope.go:117] "RemoveContainer" containerID="53f77e7aa53f7dceda6d19d9e57c19451db12591126ef950e7405e4496863975" Oct 07 19:30:49 crc kubenswrapper[4825]: I1007 19:30:49.036322 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ntw92"] Oct 07 19:30:49 crc kubenswrapper[4825]: I1007 19:30:49.044243 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ntw92"] Oct 07 19:30:49 crc kubenswrapper[4825]: I1007 19:30:49.809898 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d7cb41-3e2e-4583-8552-665f52d70bc7" path="/var/lib/kubelet/pods/67d7cb41-3e2e-4583-8552-665f52d70bc7/volumes" Oct 07 19:30:49 crc kubenswrapper[4825]: I1007 19:30:49.811878 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83137b5-f576-4a66-967b-ccdef3af6897" path="/var/lib/kubelet/pods/e83137b5-f576-4a66-967b-ccdef3af6897/volumes" Oct 07 19:30:52 crc kubenswrapper[4825]: I1007 19:30:52.797517 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:30:52 crc kubenswrapper[4825]: E1007 19:30:52.798671 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:31:03 crc kubenswrapper[4825]: I1007 
19:31:03.795363 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:31:03 crc kubenswrapper[4825]: E1007 19:31:03.796306 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:31:06 crc kubenswrapper[4825]: I1007 19:31:06.983626 4825 generic.go:334] "Generic (PLEG): container finished" podID="da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c" containerID="35dff83b305bdc83ea09083fc04441f81fed2b79a30d8531b025bbc8857a97bc" exitCode=2 Oct 07 19:31:06 crc kubenswrapper[4825]: I1007 19:31:06.983728 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" event={"ID":"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c","Type":"ContainerDied","Data":"35dff83b305bdc83ea09083fc04441f81fed2b79a30d8531b025bbc8857a97bc"} Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.432847 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.528434 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k8c6\" (UniqueName: \"kubernetes.io/projected/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-kube-api-access-4k8c6\") pod \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.528636 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-ssh-key\") pod \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.528721 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-inventory\") pod \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\" (UID: \"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c\") " Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.534944 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-kube-api-access-4k8c6" (OuterVolumeSpecName: "kube-api-access-4k8c6") pod "da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c" (UID: "da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c"). InnerVolumeSpecName "kube-api-access-4k8c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.560820 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-inventory" (OuterVolumeSpecName: "inventory") pod "da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c" (UID: "da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.576491 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c" (UID: "da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.630359 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.630392 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:31:08 crc kubenswrapper[4825]: I1007 19:31:08.630404 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k8c6\" (UniqueName: \"kubernetes.io/projected/da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c-kube-api-access-4k8c6\") on node \"crc\" DevicePath \"\"" Oct 07 19:31:09 crc kubenswrapper[4825]: I1007 19:31:09.018128 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" event={"ID":"da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c","Type":"ContainerDied","Data":"729e83ef62d3b3f4e4e7c24625544dfc355fde26bccc468fb89346d98685af9a"} Oct 07 19:31:09 crc kubenswrapper[4825]: I1007 19:31:09.018200 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="729e83ef62d3b3f4e4e7c24625544dfc355fde26bccc468fb89346d98685af9a" Oct 07 19:31:09 crc kubenswrapper[4825]: I1007 19:31:09.018212 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.032963 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z"] Oct 07 19:31:16 crc kubenswrapper[4825]: E1007 19:31:16.034127 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.034152 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.034482 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.035470 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.041909 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.042825 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.043377 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.043536 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.064307 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z"] Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.083169 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.083291 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmjq\" (UniqueName: \"kubernetes.io/projected/26532318-7138-4557-9814-febc4ba75fb8-kube-api-access-lqmjq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.083352 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.186144 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.186707 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmjq\" (UniqueName: \"kubernetes.io/projected/26532318-7138-4557-9814-febc4ba75fb8-kube-api-access-lqmjq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.187020 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.192848 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: 
\"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.201312 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.201888 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmjq\" (UniqueName: \"kubernetes.io/projected/26532318-7138-4557-9814-febc4ba75fb8-kube-api-access-lqmjq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.371214 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:31:16 crc kubenswrapper[4825]: I1007 19:31:16.774259 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z"] Oct 07 19:31:17 crc kubenswrapper[4825]: I1007 19:31:17.101882 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" event={"ID":"26532318-7138-4557-9814-febc4ba75fb8","Type":"ContainerStarted","Data":"1ed561238f5e4c4043cc150e826e88be124f3e1279dc95991ff3eeadd671739e"} Oct 07 19:31:18 crc kubenswrapper[4825]: I1007 19:31:18.114057 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" event={"ID":"26532318-7138-4557-9814-febc4ba75fb8","Type":"ContainerStarted","Data":"1142a79eee67b2b7069d1d47f8102aea383664decf648d203a2f3b8e4a4d249b"} Oct 07 19:31:18 crc kubenswrapper[4825]: I1007 19:31:18.136911 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" podStartSLOduration=1.939897912 podStartE2EDuration="2.13689345s" podCreationTimestamp="2025-10-07 19:31:16 +0000 UTC" firstStartedPulling="2025-10-07 19:31:16.781329228 +0000 UTC m=+1865.603367905" lastFinishedPulling="2025-10-07 19:31:16.978324816 +0000 UTC m=+1865.800363443" observedRunningTime="2025-10-07 19:31:18.134462271 +0000 UTC m=+1866.956500918" watchObservedRunningTime="2025-10-07 19:31:18.13689345 +0000 UTC m=+1866.958932097" Oct 07 19:31:18 crc kubenswrapper[4825]: I1007 19:31:18.796549 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:31:18 crc kubenswrapper[4825]: E1007 19:31:18.797154 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:31:31 crc kubenswrapper[4825]: I1007 19:31:31.801354 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:31:31 crc kubenswrapper[4825]: E1007 19:31:31.802211 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:31:33 crc kubenswrapper[4825]: I1007 19:31:33.050921 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jr6f5"] Oct 07 19:31:33 crc kubenswrapper[4825]: I1007 19:31:33.061591 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jr6f5"] Oct 07 19:31:33 crc kubenswrapper[4825]: I1007 19:31:33.810562 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a27b82-0f43-4b90-8c59-067a68179b64" path="/var/lib/kubelet/pods/d6a27b82-0f43-4b90-8c59-067a68179b64/volumes" Oct 07 19:31:45 crc kubenswrapper[4825]: I1007 19:31:45.795936 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:31:45 crc kubenswrapper[4825]: E1007 19:31:45.796919 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:31:48 crc kubenswrapper[4825]: I1007 19:31:48.794542 4825 scope.go:117] "RemoveContainer" containerID="0600783efcd1976b63b8cddcd172840304f1b75d0058e64c302282e50be96063" Oct 07 19:31:48 crc kubenswrapper[4825]: I1007 19:31:48.846923 4825 scope.go:117] "RemoveContainer" containerID="43deadf38680adb256c4ca4664d6b3186ab3de0e6a2559a1fdad8d7494a8ebdd" Oct 07 19:31:48 crc kubenswrapper[4825]: I1007 19:31:48.926398 4825 scope.go:117] "RemoveContainer" containerID="4e71f56cbdda4edd6fc279091f95eda084e94d2938c7086ddecacab8a2937970" Oct 07 19:31:58 crc kubenswrapper[4825]: I1007 19:31:58.795363 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:31:58 crc kubenswrapper[4825]: E1007 19:31:58.796220 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:32:03 crc kubenswrapper[4825]: I1007 19:32:03.561935 4825 generic.go:334] "Generic (PLEG): container finished" podID="26532318-7138-4557-9814-febc4ba75fb8" containerID="1142a79eee67b2b7069d1d47f8102aea383664decf648d203a2f3b8e4a4d249b" exitCode=0 Oct 07 19:32:03 crc kubenswrapper[4825]: I1007 19:32:03.562080 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" 
event={"ID":"26532318-7138-4557-9814-febc4ba75fb8","Type":"ContainerDied","Data":"1142a79eee67b2b7069d1d47f8102aea383664decf648d203a2f3b8e4a4d249b"} Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.088994 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.232366 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-ssh-key\") pod \"26532318-7138-4557-9814-febc4ba75fb8\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.232801 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-inventory\") pod \"26532318-7138-4557-9814-febc4ba75fb8\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.232872 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqmjq\" (UniqueName: \"kubernetes.io/projected/26532318-7138-4557-9814-febc4ba75fb8-kube-api-access-lqmjq\") pod \"26532318-7138-4557-9814-febc4ba75fb8\" (UID: \"26532318-7138-4557-9814-febc4ba75fb8\") " Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.237254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26532318-7138-4557-9814-febc4ba75fb8-kube-api-access-lqmjq" (OuterVolumeSpecName: "kube-api-access-lqmjq") pod "26532318-7138-4557-9814-febc4ba75fb8" (UID: "26532318-7138-4557-9814-febc4ba75fb8"). InnerVolumeSpecName "kube-api-access-lqmjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.257172 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "26532318-7138-4557-9814-febc4ba75fb8" (UID: "26532318-7138-4557-9814-febc4ba75fb8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.263957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-inventory" (OuterVolumeSpecName: "inventory") pod "26532318-7138-4557-9814-febc4ba75fb8" (UID: "26532318-7138-4557-9814-febc4ba75fb8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.336306 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.336369 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26532318-7138-4557-9814-febc4ba75fb8-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.336397 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqmjq\" (UniqueName: \"kubernetes.io/projected/26532318-7138-4557-9814-febc4ba75fb8-kube-api-access-lqmjq\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.587467 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" 
event={"ID":"26532318-7138-4557-9814-febc4ba75fb8","Type":"ContainerDied","Data":"1ed561238f5e4c4043cc150e826e88be124f3e1279dc95991ff3eeadd671739e"} Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.587530 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ed561238f5e4c4043cc150e826e88be124f3e1279dc95991ff3eeadd671739e" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.587558 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.698729 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d55wk"] Oct 07 19:32:05 crc kubenswrapper[4825]: E1007 19:32:05.699136 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26532318-7138-4557-9814-febc4ba75fb8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.699153 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="26532318-7138-4557-9814-febc4ba75fb8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.699525 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="26532318-7138-4557-9814-febc4ba75fb8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.701268 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.704458 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.704627 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.704689 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.704751 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.718026 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d55wk"] Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.854265 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.854405 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcch8\" (UniqueName: \"kubernetes.io/projected/63e9c706-689c-43be-a9a3-67f20fbfea88-kube-api-access-pcch8\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.854443 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.955687 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.955824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcch8\" (UniqueName: \"kubernetes.io/projected/63e9c706-689c-43be-a9a3-67f20fbfea88-kube-api-access-pcch8\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.955859 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.959310 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 
07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.959368 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:05 crc kubenswrapper[4825]: I1007 19:32:05.971682 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcch8\" (UniqueName: \"kubernetes.io/projected/63e9c706-689c-43be-a9a3-67f20fbfea88-kube-api-access-pcch8\") pod \"ssh-known-hosts-edpm-deployment-d55wk\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:06 crc kubenswrapper[4825]: I1007 19:32:06.033374 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:06 crc kubenswrapper[4825]: I1007 19:32:06.395295 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d55wk"] Oct 07 19:32:06 crc kubenswrapper[4825]: I1007 19:32:06.602364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" event={"ID":"63e9c706-689c-43be-a9a3-67f20fbfea88","Type":"ContainerStarted","Data":"fb187c084ab046160aa2020698d5d41692e79669a9d6f492c49fc1ab9d1d1bca"} Oct 07 19:32:07 crc kubenswrapper[4825]: I1007 19:32:07.611452 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" event={"ID":"63e9c706-689c-43be-a9a3-67f20fbfea88","Type":"ContainerStarted","Data":"cf64fee43f367abdddae2250005cd3ff421b4c6eb4ea16fb10fe77f16675e3a0"} Oct 07 19:32:07 crc kubenswrapper[4825]: I1007 19:32:07.629654 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" podStartSLOduration=2.475464762 podStartE2EDuration="2.629627163s" podCreationTimestamp="2025-10-07 19:32:05 +0000 UTC" firstStartedPulling="2025-10-07 19:32:06.396804982 +0000 UTC m=+1915.218843639" lastFinishedPulling="2025-10-07 19:32:06.550967393 +0000 UTC m=+1915.373006040" observedRunningTime="2025-10-07 19:32:07.629220787 +0000 UTC m=+1916.451259454" watchObservedRunningTime="2025-10-07 19:32:07.629627163 +0000 UTC m=+1916.451665840" Oct 07 19:32:13 crc kubenswrapper[4825]: I1007 19:32:13.694583 4825 generic.go:334] "Generic (PLEG): container finished" podID="63e9c706-689c-43be-a9a3-67f20fbfea88" containerID="cf64fee43f367abdddae2250005cd3ff421b4c6eb4ea16fb10fe77f16675e3a0" exitCode=0 Oct 07 19:32:13 crc kubenswrapper[4825]: I1007 19:32:13.694713 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" event={"ID":"63e9c706-689c-43be-a9a3-67f20fbfea88","Type":"ContainerDied","Data":"cf64fee43f367abdddae2250005cd3ff421b4c6eb4ea16fb10fe77f16675e3a0"} Oct 07 19:32:13 crc kubenswrapper[4825]: I1007 19:32:13.795761 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:32:13 crc kubenswrapper[4825]: E1007 19:32:13.796217 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.221352 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.349326 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-ssh-key-openstack-edpm-ipam\") pod \"63e9c706-689c-43be-a9a3-67f20fbfea88\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.349694 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcch8\" (UniqueName: \"kubernetes.io/projected/63e9c706-689c-43be-a9a3-67f20fbfea88-kube-api-access-pcch8\") pod \"63e9c706-689c-43be-a9a3-67f20fbfea88\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.349728 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-inventory-0\") pod \"63e9c706-689c-43be-a9a3-67f20fbfea88\" (UID: \"63e9c706-689c-43be-a9a3-67f20fbfea88\") " Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.356127 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e9c706-689c-43be-a9a3-67f20fbfea88-kube-api-access-pcch8" (OuterVolumeSpecName: "kube-api-access-pcch8") pod "63e9c706-689c-43be-a9a3-67f20fbfea88" (UID: "63e9c706-689c-43be-a9a3-67f20fbfea88"). InnerVolumeSpecName "kube-api-access-pcch8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.380103 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "63e9c706-689c-43be-a9a3-67f20fbfea88" (UID: "63e9c706-689c-43be-a9a3-67f20fbfea88"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.404037 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "63e9c706-689c-43be-a9a3-67f20fbfea88" (UID: "63e9c706-689c-43be-a9a3-67f20fbfea88"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.453442 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.454004 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcch8\" (UniqueName: \"kubernetes.io/projected/63e9c706-689c-43be-a9a3-67f20fbfea88-kube-api-access-pcch8\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.454028 4825 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/63e9c706-689c-43be-a9a3-67f20fbfea88-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.713929 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" event={"ID":"63e9c706-689c-43be-a9a3-67f20fbfea88","Type":"ContainerDied","Data":"fb187c084ab046160aa2020698d5d41692e79669a9d6f492c49fc1ab9d1d1bca"} Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.713974 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d55wk" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.713983 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb187c084ab046160aa2020698d5d41692e79669a9d6f492c49fc1ab9d1d1bca" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.810511 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p"] Oct 07 19:32:15 crc kubenswrapper[4825]: E1007 19:32:15.810783 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e9c706-689c-43be-a9a3-67f20fbfea88" containerName="ssh-known-hosts-edpm-deployment" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.810795 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e9c706-689c-43be-a9a3-67f20fbfea88" containerName="ssh-known-hosts-edpm-deployment" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.810996 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e9c706-689c-43be-a9a3-67f20fbfea88" containerName="ssh-known-hosts-edpm-deployment" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.811576 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.816115 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.816121 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.817336 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.817797 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.830669 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p"] Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.965273 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84pb4\" (UniqueName: \"kubernetes.io/projected/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-kube-api-access-84pb4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.965484 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:15 crc kubenswrapper[4825]: I1007 19:32:15.965595 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.067623 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84pb4\" (UniqueName: \"kubernetes.io/projected/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-kube-api-access-84pb4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.067796 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.067874 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.073989 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.075196 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.098960 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84pb4\" (UniqueName: \"kubernetes.io/projected/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-kube-api-access-84pb4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8kr2p\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.144425 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:16 crc kubenswrapper[4825]: W1007 19:32:16.561709 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf4da6b_b218_4ab1_87c6_7b8cfcef6810.slice/crio-606234a102caf3828988a59db37a8572ae545f84a0c30e5f75e41c1a13450836 WatchSource:0}: Error finding container 606234a102caf3828988a59db37a8572ae545f84a0c30e5f75e41c1a13450836: Status 404 returned error can't find the container with id 606234a102caf3828988a59db37a8572ae545f84a0c30e5f75e41c1a13450836 Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.563582 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p"] Oct 07 19:32:16 crc kubenswrapper[4825]: I1007 19:32:16.729189 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" event={"ID":"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810","Type":"ContainerStarted","Data":"606234a102caf3828988a59db37a8572ae545f84a0c30e5f75e41c1a13450836"} Oct 07 19:32:17 crc kubenswrapper[4825]: I1007 19:32:17.739680 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" event={"ID":"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810","Type":"ContainerStarted","Data":"eb16c1baad73eed1ccf1a06e0559a07ad8f74430d643a173ffcd98a3f832b95b"} Oct 07 19:32:17 crc kubenswrapper[4825]: I1007 19:32:17.767031 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" podStartSLOduration=2.603082387 podStartE2EDuration="2.767012013s" podCreationTimestamp="2025-10-07 19:32:15 +0000 UTC" firstStartedPulling="2025-10-07 19:32:16.565278441 +0000 UTC m=+1925.387317078" lastFinishedPulling="2025-10-07 19:32:16.729208017 +0000 UTC m=+1925.551246704" observedRunningTime="2025-10-07 
19:32:17.76375754 +0000 UTC m=+1926.585796187" watchObservedRunningTime="2025-10-07 19:32:17.767012013 +0000 UTC m=+1926.589050650" Oct 07 19:32:24 crc kubenswrapper[4825]: I1007 19:32:24.797016 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:32:24 crc kubenswrapper[4825]: E1007 19:32:24.798484 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:32:24 crc kubenswrapper[4825]: I1007 19:32:24.836867 4825 generic.go:334] "Generic (PLEG): container finished" podID="fdf4da6b-b218-4ab1-87c6-7b8cfcef6810" containerID="eb16c1baad73eed1ccf1a06e0559a07ad8f74430d643a173ffcd98a3f832b95b" exitCode=0 Oct 07 19:32:24 crc kubenswrapper[4825]: I1007 19:32:24.836945 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" event={"ID":"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810","Type":"ContainerDied","Data":"eb16c1baad73eed1ccf1a06e0559a07ad8f74430d643a173ffcd98a3f832b95b"} Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.333033 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.489073 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-ssh-key\") pod \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.489156 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84pb4\" (UniqueName: \"kubernetes.io/projected/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-kube-api-access-84pb4\") pod \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.489191 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-inventory\") pod \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\" (UID: \"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810\") " Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.496792 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-kube-api-access-84pb4" (OuterVolumeSpecName: "kube-api-access-84pb4") pod "fdf4da6b-b218-4ab1-87c6-7b8cfcef6810" (UID: "fdf4da6b-b218-4ab1-87c6-7b8cfcef6810"). InnerVolumeSpecName "kube-api-access-84pb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.521489 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fdf4da6b-b218-4ab1-87c6-7b8cfcef6810" (UID: "fdf4da6b-b218-4ab1-87c6-7b8cfcef6810"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.524567 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-inventory" (OuterVolumeSpecName: "inventory") pod "fdf4da6b-b218-4ab1-87c6-7b8cfcef6810" (UID: "fdf4da6b-b218-4ab1-87c6-7b8cfcef6810"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.590734 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.590772 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.590787 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84pb4\" (UniqueName: \"kubernetes.io/projected/fdf4da6b-b218-4ab1-87c6-7b8cfcef6810-kube-api-access-84pb4\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.882643 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" event={"ID":"fdf4da6b-b218-4ab1-87c6-7b8cfcef6810","Type":"ContainerDied","Data":"606234a102caf3828988a59db37a8572ae545f84a0c30e5f75e41c1a13450836"} Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.882705 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="606234a102caf3828988a59db37a8572ae545f84a0c30e5f75e41c1a13450836" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.882779 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8kr2p" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.971201 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn"] Oct 07 19:32:26 crc kubenswrapper[4825]: E1007 19:32:26.971696 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf4da6b-b218-4ab1-87c6-7b8cfcef6810" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.971715 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf4da6b-b218-4ab1-87c6-7b8cfcef6810" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.971899 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf4da6b-b218-4ab1-87c6-7b8cfcef6810" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.972607 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.975629 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.975890 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.975959 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.976785 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:32:26 crc kubenswrapper[4825]: I1007 19:32:26.988325 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn"] Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.100614 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmsn\" (UniqueName: \"kubernetes.io/projected/5cd31618-4e62-438f-b168-1d322052785d-kube-api-access-vpmsn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.101021 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.101203 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.203818 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmsn\" (UniqueName: \"kubernetes.io/projected/5cd31618-4e62-438f-b168-1d322052785d-kube-api-access-vpmsn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.203907 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.203952 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.209983 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.210081 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.234867 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpmsn\" (UniqueName: \"kubernetes.io/projected/5cd31618-4e62-438f-b168-1d322052785d-kube-api-access-vpmsn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.293624 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.835206 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn"] Oct 07 19:32:27 crc kubenswrapper[4825]: W1007 19:32:27.840706 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cd31618_4e62_438f_b168_1d322052785d.slice/crio-2164bc02dc378ae06b48042f022e24b1b2a97366a41ab97cd989778532b3c02f WatchSource:0}: Error finding container 2164bc02dc378ae06b48042f022e24b1b2a97366a41ab97cd989778532b3c02f: Status 404 returned error can't find the container with id 2164bc02dc378ae06b48042f022e24b1b2a97366a41ab97cd989778532b3c02f Oct 07 19:32:27 crc kubenswrapper[4825]: I1007 19:32:27.892802 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" event={"ID":"5cd31618-4e62-438f-b168-1d322052785d","Type":"ContainerStarted","Data":"2164bc02dc378ae06b48042f022e24b1b2a97366a41ab97cd989778532b3c02f"} Oct 07 19:32:28 crc kubenswrapper[4825]: I1007 19:32:28.906413 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" event={"ID":"5cd31618-4e62-438f-b168-1d322052785d","Type":"ContainerStarted","Data":"07b1dbcbe4366a9fa99ef3cec3bb401cff86aa0efc5bc9862bce9f482bbb519f"} Oct 07 19:32:28 crc kubenswrapper[4825]: I1007 19:32:28.934048 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" podStartSLOduration=2.758353623 podStartE2EDuration="2.934026978s" podCreationTimestamp="2025-10-07 19:32:26 +0000 UTC" firstStartedPulling="2025-10-07 19:32:27.843483523 +0000 UTC m=+1936.665522160" lastFinishedPulling="2025-10-07 19:32:28.019156878 +0000 UTC m=+1936.841195515" 
observedRunningTime="2025-10-07 19:32:28.925303877 +0000 UTC m=+1937.747342524" watchObservedRunningTime="2025-10-07 19:32:28.934026978 +0000 UTC m=+1937.756065625" Oct 07 19:32:36 crc kubenswrapper[4825]: I1007 19:32:36.795407 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:32:38 crc kubenswrapper[4825]: I1007 19:32:38.005155 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"7dc751bc7deb95ed4969acee0ba339cabe0592b5e0342dcfba004125f3c9f015"} Oct 07 19:32:38 crc kubenswrapper[4825]: I1007 19:32:38.007443 4825 generic.go:334] "Generic (PLEG): container finished" podID="5cd31618-4e62-438f-b168-1d322052785d" containerID="07b1dbcbe4366a9fa99ef3cec3bb401cff86aa0efc5bc9862bce9f482bbb519f" exitCode=0 Oct 07 19:32:38 crc kubenswrapper[4825]: I1007 19:32:38.007485 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" event={"ID":"5cd31618-4e62-438f-b168-1d322052785d","Type":"ContainerDied","Data":"07b1dbcbe4366a9fa99ef3cec3bb401cff86aa0efc5bc9862bce9f482bbb519f"} Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.547368 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.705833 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-inventory\") pod \"5cd31618-4e62-438f-b168-1d322052785d\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.706198 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-ssh-key\") pod \"5cd31618-4e62-438f-b168-1d322052785d\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.706398 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpmsn\" (UniqueName: \"kubernetes.io/projected/5cd31618-4e62-438f-b168-1d322052785d-kube-api-access-vpmsn\") pod \"5cd31618-4e62-438f-b168-1d322052785d\" (UID: \"5cd31618-4e62-438f-b168-1d322052785d\") " Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.717443 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cd31618-4e62-438f-b168-1d322052785d-kube-api-access-vpmsn" (OuterVolumeSpecName: "kube-api-access-vpmsn") pod "5cd31618-4e62-438f-b168-1d322052785d" (UID: "5cd31618-4e62-438f-b168-1d322052785d"). InnerVolumeSpecName "kube-api-access-vpmsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.736405 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-inventory" (OuterVolumeSpecName: "inventory") pod "5cd31618-4e62-438f-b168-1d322052785d" (UID: "5cd31618-4e62-438f-b168-1d322052785d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.757531 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5cd31618-4e62-438f-b168-1d322052785d" (UID: "5cd31618-4e62-438f-b168-1d322052785d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.809089 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.809124 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5cd31618-4e62-438f-b168-1d322052785d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:39 crc kubenswrapper[4825]: I1007 19:32:39.809134 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpmsn\" (UniqueName: \"kubernetes.io/projected/5cd31618-4e62-438f-b168-1d322052785d-kube-api-access-vpmsn\") on node \"crc\" DevicePath \"\"" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.038901 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" event={"ID":"5cd31618-4e62-438f-b168-1d322052785d","Type":"ContainerDied","Data":"2164bc02dc378ae06b48042f022e24b1b2a97366a41ab97cd989778532b3c02f"} Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.038978 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2164bc02dc378ae06b48042f022e24b1b2a97366a41ab97cd989778532b3c02f" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.039047 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.192079 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr"] Oct 07 19:32:40 crc kubenswrapper[4825]: E1007 19:32:40.192894 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd31618-4e62-438f-b168-1d322052785d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.192928 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd31618-4e62-438f-b168-1d322052785d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.193355 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd31618-4e62-438f-b168-1d322052785d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.194684 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.198036 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.198376 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.198547 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.198775 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.198954 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.199296 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.199518 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.199649 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.210432 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr"] Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317454 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317516 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317552 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317588 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317641 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317711 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317805 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317834 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhr2\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-kube-api-access-qhhr2\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.317986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.318110 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.318143 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.318303 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.318338 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420188 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420343 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420406 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420452 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420496 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420547 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420596 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420655 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.420706 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.421622 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.422434 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhr2\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-kube-api-access-qhhr2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.422497 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.422647 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.422690 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.426865 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.427398 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.427416 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.427624 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.428056 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.428081 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.428723 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.429781 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.429813 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.430670 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" 
Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.434018 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.434143 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.434555 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.450356 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhr2\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-kube-api-access-qhhr2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.525517 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:32:40 crc kubenswrapper[4825]: I1007 19:32:40.940201 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr"] Oct 07 19:32:40 crc kubenswrapper[4825]: W1007 19:32:40.963553 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a03ee0_54d8_44d6_94fb_59a5bbed04fd.slice/crio-f7a3ed300b8975e505eb2d275c1bc294a9f17570d53f25d0957e3af1703539e0 WatchSource:0}: Error finding container f7a3ed300b8975e505eb2d275c1bc294a9f17570d53f25d0957e3af1703539e0: Status 404 returned error can't find the container with id f7a3ed300b8975e505eb2d275c1bc294a9f17570d53f25d0957e3af1703539e0 Oct 07 19:32:41 crc kubenswrapper[4825]: I1007 19:32:41.049695 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" event={"ID":"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd","Type":"ContainerStarted","Data":"f7a3ed300b8975e505eb2d275c1bc294a9f17570d53f25d0957e3af1703539e0"} Oct 07 19:32:42 crc kubenswrapper[4825]: I1007 19:32:42.065975 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" event={"ID":"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd","Type":"ContainerStarted","Data":"232f16f2a63700103826f537b18cbd092ae2d33df9c4075f57c705b03ec604f3"} Oct 07 19:32:42 crc kubenswrapper[4825]: I1007 19:32:42.103847 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" podStartSLOduration=1.8877569840000001 podStartE2EDuration="2.103822489s" podCreationTimestamp="2025-10-07 19:32:40 +0000 UTC" firstStartedPulling="2025-10-07 19:32:40.965109965 +0000 UTC m=+1949.787148592" lastFinishedPulling="2025-10-07 19:32:41.18117545 +0000 UTC 
m=+1950.003214097" observedRunningTime="2025-10-07 19:32:42.091648368 +0000 UTC m=+1950.913687025" watchObservedRunningTime="2025-10-07 19:32:42.103822489 +0000 UTC m=+1950.925861156" Oct 07 19:33:20 crc kubenswrapper[4825]: I1007 19:33:20.498877 4825 generic.go:334] "Generic (PLEG): container finished" podID="e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" containerID="232f16f2a63700103826f537b18cbd092ae2d33df9c4075f57c705b03ec604f3" exitCode=0 Oct 07 19:33:20 crc kubenswrapper[4825]: I1007 19:33:20.499588 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" event={"ID":"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd","Type":"ContainerDied","Data":"232f16f2a63700103826f537b18cbd092ae2d33df9c4075f57c705b03ec604f3"} Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.021503 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150116 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-libvirt-combined-ca-bundle\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150214 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhhr2\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-kube-api-access-qhhr2\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150305 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150394 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150444 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-inventory\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150482 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150545 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-repo-setup-combined-ca-bundle\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150605 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-neutron-metadata-combined-ca-bundle\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150674 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ovn-combined-ca-bundle\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150774 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150822 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-telemetry-combined-ca-bundle\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.150868 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-bootstrap-combined-ca-bundle\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.151021 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-nova-combined-ca-bundle\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.151132 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ssh-key\") pod \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\" (UID: \"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd\") " Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.158116 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.158145 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.158282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.158748 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.160143 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.160756 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.160980 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.162471 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.162525 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-kube-api-access-qhhr2" (OuterVolumeSpecName: "kube-api-access-qhhr2") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "kube-api-access-qhhr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.162933 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.163069 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.163542 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.191660 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-inventory" (OuterVolumeSpecName: "inventory") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.198500 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" (UID: "e3a03ee0-54d8-44d6-94fb-59a5bbed04fd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254560 4825 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254606 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254628 4825 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254648 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhhr2\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-kube-api-access-qhhr2\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254669 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254695 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254717 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254737 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254758 4825 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254783 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254807 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254841 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.254861 4825 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 
19:33:22.254880 4825 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a03ee0-54d8-44d6-94fb-59a5bbed04fd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.527875 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" event={"ID":"e3a03ee0-54d8-44d6-94fb-59a5bbed04fd","Type":"ContainerDied","Data":"f7a3ed300b8975e505eb2d275c1bc294a9f17570d53f25d0957e3af1703539e0"} Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.527923 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.527944 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7a3ed300b8975e505eb2d275c1bc294a9f17570d53f25d0957e3af1703539e0" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.660816 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz"] Oct 07 19:33:22 crc kubenswrapper[4825]: E1007 19:33:22.661594 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.661750 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.662151 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a03ee0-54d8-44d6-94fb-59a5bbed04fd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.663216 4825 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.673254 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.674030 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.674342 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.675131 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.678144 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.697049 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz"] Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.773046 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.773119 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.773175 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.773199 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdhwn\" (UniqueName: \"kubernetes.io/projected/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-kube-api-access-mdhwn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.773219 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.874356 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.874418 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.874492 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.874515 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdhwn\" (UniqueName: \"kubernetes.io/projected/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-kube-api-access-mdhwn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.874532 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.875603 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc 
kubenswrapper[4825]: I1007 19:33:22.878469 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.878649 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.878740 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:22 crc kubenswrapper[4825]: I1007 19:33:22.890451 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdhwn\" (UniqueName: \"kubernetes.io/projected/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-kube-api-access-mdhwn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxqvz\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:23 crc kubenswrapper[4825]: I1007 19:33:23.012555 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:33:23 crc kubenswrapper[4825]: I1007 19:33:23.617176 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:33:23 crc kubenswrapper[4825]: I1007 19:33:23.627363 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz"] Oct 07 19:33:24 crc kubenswrapper[4825]: I1007 19:33:24.558194 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" event={"ID":"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45","Type":"ContainerStarted","Data":"1703b7c8c88d06bdc606a9dc7b0081b07b3220669681c2cf7d05f56b56195d38"} Oct 07 19:33:24 crc kubenswrapper[4825]: I1007 19:33:24.558612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" event={"ID":"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45","Type":"ContainerStarted","Data":"717c376dbf15cde58349880ecbeb1e63776e4373e41402c8c08446c6bb3400b1"} Oct 07 19:33:24 crc kubenswrapper[4825]: I1007 19:33:24.594199 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" podStartSLOduration=2.408089423 podStartE2EDuration="2.594178603s" podCreationTimestamp="2025-10-07 19:33:22 +0000 UTC" firstStartedPulling="2025-10-07 19:33:23.616629376 +0000 UTC m=+1992.438668053" lastFinishedPulling="2025-10-07 19:33:23.802718596 +0000 UTC m=+1992.624757233" observedRunningTime="2025-10-07 19:33:24.594156432 +0000 UTC m=+1993.416195149" watchObservedRunningTime="2025-10-07 19:33:24.594178603 +0000 UTC m=+1993.416217250" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.075475 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zvlf4"] Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.077910 4825 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.104907 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvlf4"] Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.185607 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-catalog-content\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.185766 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-utilities\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.185823 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r45lf\" (UniqueName: \"kubernetes.io/projected/1f131737-2f28-456f-b36f-1a7caaeaec60-kube-api-access-r45lf\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.287305 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-utilities\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.287356 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r45lf\" (UniqueName: \"kubernetes.io/projected/1f131737-2f28-456f-b36f-1a7caaeaec60-kube-api-access-r45lf\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.287463 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-catalog-content\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.287909 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-utilities\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.287923 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-catalog-content\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.316985 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r45lf\" (UniqueName: \"kubernetes.io/projected/1f131737-2f28-456f-b36f-1a7caaeaec60-kube-api-access-r45lf\") pod \"redhat-marketplace-zvlf4\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.411996 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:27 crc kubenswrapper[4825]: I1007 19:34:27.881146 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvlf4"] Oct 07 19:34:28 crc kubenswrapper[4825]: I1007 19:34:28.335697 4825 generic.go:334] "Generic (PLEG): container finished" podID="a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" containerID="1703b7c8c88d06bdc606a9dc7b0081b07b3220669681c2cf7d05f56b56195d38" exitCode=0 Oct 07 19:34:28 crc kubenswrapper[4825]: I1007 19:34:28.335830 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" event={"ID":"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45","Type":"ContainerDied","Data":"1703b7c8c88d06bdc606a9dc7b0081b07b3220669681c2cf7d05f56b56195d38"} Oct 07 19:34:28 crc kubenswrapper[4825]: I1007 19:34:28.341626 4825 generic.go:334] "Generic (PLEG): container finished" podID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerID="c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1" exitCode=0 Oct 07 19:34:28 crc kubenswrapper[4825]: I1007 19:34:28.341685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvlf4" event={"ID":"1f131737-2f28-456f-b36f-1a7caaeaec60","Type":"ContainerDied","Data":"c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1"} Oct 07 19:34:28 crc kubenswrapper[4825]: I1007 19:34:28.341716 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvlf4" event={"ID":"1f131737-2f28-456f-b36f-1a7caaeaec60","Type":"ContainerStarted","Data":"9508d0d2b6a21da56ba08252992fd572c46eff8b1b3d522065b33d324998f68a"} Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.373689 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvlf4" 
event={"ID":"1f131737-2f28-456f-b36f-1a7caaeaec60","Type":"ContainerStarted","Data":"b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df"} Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.763945 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.942575 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdhwn\" (UniqueName: \"kubernetes.io/projected/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-kube-api-access-mdhwn\") pod \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.942663 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ssh-key\") pod \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.942854 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-inventory\") pod \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.942954 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovncontroller-config-0\") pod \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.943272 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovn-combined-ca-bundle\") pod \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\" (UID: \"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45\") " Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.950950 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" (UID: "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.959716 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-kube-api-access-mdhwn" (OuterVolumeSpecName: "kube-api-access-mdhwn") pod "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" (UID: "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45"). InnerVolumeSpecName "kube-api-access-mdhwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.975926 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" (UID: "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.993731 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" (UID: "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:34:29 crc kubenswrapper[4825]: I1007 19:34:29.997275 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-inventory" (OuterVolumeSpecName: "inventory") pod "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" (UID: "a32b4f65-af6c-4bed-a97c-ec9ced0b4c45"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.046415 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.046456 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdhwn\" (UniqueName: \"kubernetes.io/projected/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-kube-api-access-mdhwn\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.046473 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.046488 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.046570 4825 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a32b4f65-af6c-4bed-a97c-ec9ced0b4c45-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.390166 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" event={"ID":"a32b4f65-af6c-4bed-a97c-ec9ced0b4c45","Type":"ContainerDied","Data":"717c376dbf15cde58349880ecbeb1e63776e4373e41402c8c08446c6bb3400b1"} Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.390269 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="717c376dbf15cde58349880ecbeb1e63776e4373e41402c8c08446c6bb3400b1" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.390188 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxqvz" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.396589 4825 generic.go:334] "Generic (PLEG): container finished" podID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerID="b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df" exitCode=0 Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.396647 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvlf4" event={"ID":"1f131737-2f28-456f-b36f-1a7caaeaec60","Type":"ContainerDied","Data":"b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df"} Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.492887 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759"] Oct 07 19:34:30 crc kubenswrapper[4825]: E1007 19:34:30.493716 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.493790 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.494310 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a32b4f65-af6c-4bed-a97c-ec9ced0b4c45" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.495469 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.498733 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.498963 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.499213 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.499591 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.499601 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.500351 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.511263 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759"] Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.663702 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.663759 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.663786 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.663806 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.664246 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.664348 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9q2h\" (UniqueName: \"kubernetes.io/projected/e22aff57-c4de-445a-b196-23d2e791a10f-kube-api-access-g9q2h\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.766621 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.766704 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.766755 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: 
I1007 19:34:30.766796 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.766903 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.766947 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9q2h\" (UniqueName: \"kubernetes.io/projected/e22aff57-c4de-445a-b196-23d2e791a10f-kube-api-access-g9q2h\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.770582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.773031 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.773103 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.773611 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.774549 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.791098 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9q2h\" (UniqueName: \"kubernetes.io/projected/e22aff57-c4de-445a-b196-23d2e791a10f-kube-api-access-g9q2h\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759\" 
(UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:30 crc kubenswrapper[4825]: I1007 19:34:30.818578 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:34:31 crc kubenswrapper[4825]: I1007 19:34:31.406969 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759"] Oct 07 19:34:31 crc kubenswrapper[4825]: W1007 19:34:31.421295 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode22aff57_c4de_445a_b196_23d2e791a10f.slice/crio-327ae2f9f68873f96863d07c099889e6cd8f84abeed953640ea8f8266e5eea0a WatchSource:0}: Error finding container 327ae2f9f68873f96863d07c099889e6cd8f84abeed953640ea8f8266e5eea0a: Status 404 returned error can't find the container with id 327ae2f9f68873f96863d07c099889e6cd8f84abeed953640ea8f8266e5eea0a Oct 07 19:34:31 crc kubenswrapper[4825]: I1007 19:34:31.422397 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvlf4" event={"ID":"1f131737-2f28-456f-b36f-1a7caaeaec60","Type":"ContainerStarted","Data":"2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622"} Oct 07 19:34:31 crc kubenswrapper[4825]: I1007 19:34:31.461695 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zvlf4" podStartSLOduration=1.9304503899999998 podStartE2EDuration="4.461676601s" podCreationTimestamp="2025-10-07 19:34:27 +0000 UTC" firstStartedPulling="2025-10-07 19:34:28.345334505 +0000 UTC m=+2057.167373182" lastFinishedPulling="2025-10-07 19:34:30.876560736 +0000 UTC m=+2059.698599393" observedRunningTime="2025-10-07 19:34:31.451865699 +0000 UTC m=+2060.273904336" watchObservedRunningTime="2025-10-07 
19:34:31.461676601 +0000 UTC m=+2060.283715248" Oct 07 19:34:32 crc kubenswrapper[4825]: I1007 19:34:32.438984 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" event={"ID":"e22aff57-c4de-445a-b196-23d2e791a10f","Type":"ContainerStarted","Data":"45f5755198e35875182ebdec1f080048bbb6c4b5ff85fc096dbf70a247adfd62"} Oct 07 19:34:32 crc kubenswrapper[4825]: I1007 19:34:32.478346 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" event={"ID":"e22aff57-c4de-445a-b196-23d2e791a10f","Type":"ContainerStarted","Data":"327ae2f9f68873f96863d07c099889e6cd8f84abeed953640ea8f8266e5eea0a"} Oct 07 19:34:32 crc kubenswrapper[4825]: I1007 19:34:32.484827 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" podStartSLOduration=2.325512373 podStartE2EDuration="2.484803498s" podCreationTimestamp="2025-10-07 19:34:30 +0000 UTC" firstStartedPulling="2025-10-07 19:34:31.427279233 +0000 UTC m=+2060.249317880" lastFinishedPulling="2025-10-07 19:34:31.586570358 +0000 UTC m=+2060.408609005" observedRunningTime="2025-10-07 19:34:32.472849154 +0000 UTC m=+2061.294887891" watchObservedRunningTime="2025-10-07 19:34:32.484803498 +0000 UTC m=+2061.306842145" Oct 07 19:34:37 crc kubenswrapper[4825]: I1007 19:34:37.413182 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:37 crc kubenswrapper[4825]: I1007 19:34:37.413953 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:37 crc kubenswrapper[4825]: I1007 19:34:37.501446 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:37 crc 
kubenswrapper[4825]: I1007 19:34:37.590103 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:37 crc kubenswrapper[4825]: I1007 19:34:37.760650 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvlf4"] Oct 07 19:34:39 crc kubenswrapper[4825]: I1007 19:34:39.521579 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zvlf4" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="registry-server" containerID="cri-o://2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622" gracePeriod=2 Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.020174 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.177855 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r45lf\" (UniqueName: \"kubernetes.io/projected/1f131737-2f28-456f-b36f-1a7caaeaec60-kube-api-access-r45lf\") pod \"1f131737-2f28-456f-b36f-1a7caaeaec60\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.178079 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-catalog-content\") pod \"1f131737-2f28-456f-b36f-1a7caaeaec60\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.178126 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-utilities\") pod \"1f131737-2f28-456f-b36f-1a7caaeaec60\" (UID: \"1f131737-2f28-456f-b36f-1a7caaeaec60\") " Oct 07 19:34:40 crc 
kubenswrapper[4825]: I1007 19:34:40.178924 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-utilities" (OuterVolumeSpecName: "utilities") pod "1f131737-2f28-456f-b36f-1a7caaeaec60" (UID: "1f131737-2f28-456f-b36f-1a7caaeaec60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.186448 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f131737-2f28-456f-b36f-1a7caaeaec60-kube-api-access-r45lf" (OuterVolumeSpecName: "kube-api-access-r45lf") pod "1f131737-2f28-456f-b36f-1a7caaeaec60" (UID: "1f131737-2f28-456f-b36f-1a7caaeaec60"). InnerVolumeSpecName "kube-api-access-r45lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.198146 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f131737-2f28-456f-b36f-1a7caaeaec60" (UID: "1f131737-2f28-456f-b36f-1a7caaeaec60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.280402 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r45lf\" (UniqueName: \"kubernetes.io/projected/1f131737-2f28-456f-b36f-1a7caaeaec60-kube-api-access-r45lf\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.280438 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.280453 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f131737-2f28-456f-b36f-1a7caaeaec60-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.532280 4825 generic.go:334] "Generic (PLEG): container finished" podID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerID="2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622" exitCode=0 Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.532332 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvlf4" event={"ID":"1f131737-2f28-456f-b36f-1a7caaeaec60","Type":"ContainerDied","Data":"2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622"} Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.532361 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvlf4" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.532376 4825 scope.go:117] "RemoveContainer" containerID="2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.532364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvlf4" event={"ID":"1f131737-2f28-456f-b36f-1a7caaeaec60","Type":"ContainerDied","Data":"9508d0d2b6a21da56ba08252992fd572c46eff8b1b3d522065b33d324998f68a"} Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.567548 4825 scope.go:117] "RemoveContainer" containerID="b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.576068 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvlf4"] Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.585181 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvlf4"] Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.595305 4825 scope.go:117] "RemoveContainer" containerID="c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.659656 4825 scope.go:117] "RemoveContainer" containerID="2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622" Oct 07 19:34:40 crc kubenswrapper[4825]: E1007 19:34:40.660386 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622\": container with ID starting with 2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622 not found: ID does not exist" containerID="2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.660414 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622"} err="failed to get container status \"2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622\": rpc error: code = NotFound desc = could not find container \"2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622\": container with ID starting with 2bec7a46162d5b5886343560a391181cf882ace060c02f729ed7b27c7f624622 not found: ID does not exist" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.660436 4825 scope.go:117] "RemoveContainer" containerID="b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df" Oct 07 19:34:40 crc kubenswrapper[4825]: E1007 19:34:40.660848 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df\": container with ID starting with b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df not found: ID does not exist" containerID="b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.660885 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df"} err="failed to get container status \"b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df\": rpc error: code = NotFound desc = could not find container \"b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df\": container with ID starting with b2e59ca5e02d4868bf92854645a6f4cc88a776c98d19e8cc2c9e9291c4bc82df not found: ID does not exist" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.660900 4825 scope.go:117] "RemoveContainer" containerID="c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1" Oct 07 19:34:40 crc kubenswrapper[4825]: E1007 
19:34:40.661305 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1\": container with ID starting with c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1 not found: ID does not exist" containerID="c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1" Oct 07 19:34:40 crc kubenswrapper[4825]: I1007 19:34:40.661383 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1"} err="failed to get container status \"c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1\": rpc error: code = NotFound desc = could not find container \"c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1\": container with ID starting with c63f628d3d4efe1073475278fa9f30dec45144b7743fc8d47ef8939d3c7f3cb1 not found: ID does not exist" Oct 07 19:34:41 crc kubenswrapper[4825]: I1007 19:34:41.815564 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" path="/var/lib/kubelet/pods/1f131737-2f28-456f-b36f-1a7caaeaec60/volumes" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.168714 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gdls5"] Oct 07 19:34:59 crc kubenswrapper[4825]: E1007 19:34:59.169649 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="extract-utilities" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.169665 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="extract-utilities" Oct 07 19:34:59 crc kubenswrapper[4825]: E1007 19:34:59.169711 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="registry-server" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.169720 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="registry-server" Oct 07 19:34:59 crc kubenswrapper[4825]: E1007 19:34:59.169750 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="extract-content" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.169758 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="extract-content" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.169999 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f131737-2f28-456f-b36f-1a7caaeaec60" containerName="registry-server" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.172655 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.203099 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdls5"] Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.263426 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/707f4130-70a2-4161-80c6-d5767bf6752e-catalog-content\") pod \"community-operators-gdls5\" (UID: \"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.263526 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tw7l\" (UniqueName: \"kubernetes.io/projected/707f4130-70a2-4161-80c6-d5767bf6752e-kube-api-access-4tw7l\") pod \"community-operators-gdls5\" (UID: 
\"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.263593 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/707f4130-70a2-4161-80c6-d5767bf6752e-utilities\") pod \"community-operators-gdls5\" (UID: \"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.365093 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/707f4130-70a2-4161-80c6-d5767bf6752e-catalog-content\") pod \"community-operators-gdls5\" (UID: \"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.365210 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tw7l\" (UniqueName: \"kubernetes.io/projected/707f4130-70a2-4161-80c6-d5767bf6752e-kube-api-access-4tw7l\") pod \"community-operators-gdls5\" (UID: \"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.365390 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/707f4130-70a2-4161-80c6-d5767bf6752e-utilities\") pod \"community-operators-gdls5\" (UID: \"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.366001 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/707f4130-70a2-4161-80c6-d5767bf6752e-utilities\") pod \"community-operators-gdls5\" (UID: 
\"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.366309 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/707f4130-70a2-4161-80c6-d5767bf6752e-catalog-content\") pod \"community-operators-gdls5\" (UID: \"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.396045 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tw7l\" (UniqueName: \"kubernetes.io/projected/707f4130-70a2-4161-80c6-d5767bf6752e-kube-api-access-4tw7l\") pod \"community-operators-gdls5\" (UID: \"707f4130-70a2-4161-80c6-d5767bf6752e\") " pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.505635 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:34:59 crc kubenswrapper[4825]: I1007 19:34:59.999523 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdls5"] Oct 07 19:35:00 crc kubenswrapper[4825]: I1007 19:35:00.746782 4825 generic.go:334] "Generic (PLEG): container finished" podID="707f4130-70a2-4161-80c6-d5767bf6752e" containerID="ab0ad28223963aea2eb959b40c0c954b85032669cf9892d4884ab922ed0b45f0" exitCode=0 Oct 07 19:35:00 crc kubenswrapper[4825]: I1007 19:35:00.746848 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdls5" event={"ID":"707f4130-70a2-4161-80c6-d5767bf6752e","Type":"ContainerDied","Data":"ab0ad28223963aea2eb959b40c0c954b85032669cf9892d4884ab922ed0b45f0"} Oct 07 19:35:00 crc kubenswrapper[4825]: I1007 19:35:00.746888 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdls5" event={"ID":"707f4130-70a2-4161-80c6-d5767bf6752e","Type":"ContainerStarted","Data":"66cf704242283b862b67adcda8b6774a95d52ae55ac005bea29a95a7c880b3d8"} Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.533734 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x5kh7"] Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.536642 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.544777 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5kh7"] Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.673609 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-catalog-content\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.673710 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9t4z\" (UniqueName: \"kubernetes.io/projected/a065512e-39ae-4763-b562-fb77245458ae-kube-api-access-b9t4z\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.673744 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-utilities\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.775684 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-catalog-content\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.775741 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b9t4z\" (UniqueName: \"kubernetes.io/projected/a065512e-39ae-4763-b562-fb77245458ae-kube-api-access-b9t4z\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.775764 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-utilities\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.776259 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-catalog-content\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.776289 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-utilities\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.797078 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9t4z\" (UniqueName: \"kubernetes.io/projected/a065512e-39ae-4763-b562-fb77245458ae-kube-api-access-b9t4z\") pod \"certified-operators-x5kh7\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:03 crc kubenswrapper[4825]: I1007 19:35:03.870900 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:05 crc kubenswrapper[4825]: I1007 19:35:05.709395 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:35:05 crc kubenswrapper[4825]: I1007 19:35:05.709718 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:35:05 crc kubenswrapper[4825]: I1007 19:35:05.968887 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5kh7"] Oct 07 19:35:06 crc kubenswrapper[4825]: E1007 19:35:06.454692 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707f4130_70a2_4161_80c6_d5767bf6752e.slice/crio-conmon-0510c009bf12f5dd9c9f8bafd4b2a0689e251e98850a8cfd96c286f88ad94059.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707f4130_70a2_4161_80c6_d5767bf6752e.slice/crio-0510c009bf12f5dd9c9f8bafd4b2a0689e251e98850a8cfd96c286f88ad94059.scope\": RecentStats: unable to find data in memory cache]" Oct 07 19:35:06 crc kubenswrapper[4825]: I1007 19:35:06.810965 4825 generic.go:334] "Generic (PLEG): container finished" podID="707f4130-70a2-4161-80c6-d5767bf6752e" containerID="0510c009bf12f5dd9c9f8bafd4b2a0689e251e98850a8cfd96c286f88ad94059" exitCode=0 Oct 07 19:35:06 crc kubenswrapper[4825]: I1007 19:35:06.811111 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdls5" event={"ID":"707f4130-70a2-4161-80c6-d5767bf6752e","Type":"ContainerDied","Data":"0510c009bf12f5dd9c9f8bafd4b2a0689e251e98850a8cfd96c286f88ad94059"} Oct 07 19:35:06 crc kubenswrapper[4825]: I1007 19:35:06.814216 4825 generic.go:334] "Generic (PLEG): container finished" podID="a065512e-39ae-4763-b562-fb77245458ae" containerID="572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5" exitCode=0 Oct 07 19:35:06 crc kubenswrapper[4825]: I1007 19:35:06.814372 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5kh7" event={"ID":"a065512e-39ae-4763-b562-fb77245458ae","Type":"ContainerDied","Data":"572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5"} Oct 07 19:35:06 crc kubenswrapper[4825]: I1007 19:35:06.814424 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5kh7" event={"ID":"a065512e-39ae-4763-b562-fb77245458ae","Type":"ContainerStarted","Data":"959d7cfebca01fd0cacb4bd36367b930b274ab0c0704cba43b2576618c1c392a"} Oct 07 19:35:07 crc kubenswrapper[4825]: I1007 19:35:07.834640 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdls5" event={"ID":"707f4130-70a2-4161-80c6-d5767bf6752e","Type":"ContainerStarted","Data":"9ab4acb9e09533bc4859477c20e6b82e746f1904918972782c2efedc12dcdc8a"} Oct 07 19:35:07 crc kubenswrapper[4825]: I1007 19:35:07.837726 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5kh7" event={"ID":"a065512e-39ae-4763-b562-fb77245458ae","Type":"ContainerStarted","Data":"b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa"} Oct 07 19:35:07 crc kubenswrapper[4825]: I1007 19:35:07.853948 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gdls5" 
podStartSLOduration=2.299747134 podStartE2EDuration="8.85392988s" podCreationTimestamp="2025-10-07 19:34:59 +0000 UTC" firstStartedPulling="2025-10-07 19:35:00.749756999 +0000 UTC m=+2089.571795666" lastFinishedPulling="2025-10-07 19:35:07.303939735 +0000 UTC m=+2096.125978412" observedRunningTime="2025-10-07 19:35:07.851529355 +0000 UTC m=+2096.673568012" watchObservedRunningTime="2025-10-07 19:35:07.85392988 +0000 UTC m=+2096.675968517" Oct 07 19:35:08 crc kubenswrapper[4825]: I1007 19:35:08.850122 4825 generic.go:334] "Generic (PLEG): container finished" podID="a065512e-39ae-4763-b562-fb77245458ae" containerID="b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa" exitCode=0 Oct 07 19:35:08 crc kubenswrapper[4825]: I1007 19:35:08.851450 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5kh7" event={"ID":"a065512e-39ae-4763-b562-fb77245458ae","Type":"ContainerDied","Data":"b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa"} Oct 07 19:35:09 crc kubenswrapper[4825]: I1007 19:35:09.506176 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:35:09 crc kubenswrapper[4825]: I1007 19:35:09.508486 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:35:10 crc kubenswrapper[4825]: I1007 19:35:10.583211 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gdls5" podUID="707f4130-70a2-4161-80c6-d5767bf6752e" containerName="registry-server" probeResult="failure" output=< Oct 07 19:35:10 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Oct 07 19:35:10 crc kubenswrapper[4825]: > Oct 07 19:35:10 crc kubenswrapper[4825]: I1007 19:35:10.886984 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5kh7" 
event={"ID":"a065512e-39ae-4763-b562-fb77245458ae","Type":"ContainerStarted","Data":"c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6"} Oct 07 19:35:10 crc kubenswrapper[4825]: I1007 19:35:10.909682 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x5kh7" podStartSLOduration=5.10447616 podStartE2EDuration="7.909640329s" podCreationTimestamp="2025-10-07 19:35:03 +0000 UTC" firstStartedPulling="2025-10-07 19:35:06.816930472 +0000 UTC m=+2095.638969159" lastFinishedPulling="2025-10-07 19:35:09.622094681 +0000 UTC m=+2098.444133328" observedRunningTime="2025-10-07 19:35:10.907614116 +0000 UTC m=+2099.729652803" watchObservedRunningTime="2025-10-07 19:35:10.909640329 +0000 UTC m=+2099.731678976" Oct 07 19:35:13 crc kubenswrapper[4825]: I1007 19:35:13.871775 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:13 crc kubenswrapper[4825]: I1007 19:35:13.872549 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:13 crc kubenswrapper[4825]: I1007 19:35:13.932071 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:19 crc kubenswrapper[4825]: I1007 19:35:19.578518 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:35:19 crc kubenswrapper[4825]: I1007 19:35:19.647686 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gdls5" Oct 07 19:35:19 crc kubenswrapper[4825]: I1007 19:35:19.725425 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdls5"] Oct 07 19:35:19 crc kubenswrapper[4825]: I1007 19:35:19.820837 4825 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8bx2n"] Oct 07 19:35:19 crc kubenswrapper[4825]: I1007 19:35:19.821402 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8bx2n" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="registry-server" containerID="cri-o://5045471ec2dc1f7dc9915b4039d295d62f11fa66581d3331c12001a6a80c8936" gracePeriod=2 Oct 07 19:35:19 crc kubenswrapper[4825]: I1007 19:35:19.995892 4825 generic.go:334] "Generic (PLEG): container finished" podID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerID="5045471ec2dc1f7dc9915b4039d295d62f11fa66581d3331c12001a6a80c8936" exitCode=0 Oct 07 19:35:19 crc kubenswrapper[4825]: I1007 19:35:19.996324 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bx2n" event={"ID":"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4","Type":"ContainerDied","Data":"5045471ec2dc1f7dc9915b4039d295d62f11fa66581d3331c12001a6a80c8936"} Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.368005 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.448562 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-catalog-content\") pod \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.448708 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxtp8\" (UniqueName: \"kubernetes.io/projected/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-kube-api-access-lxtp8\") pod \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.448743 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-utilities\") pod \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\" (UID: \"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4\") " Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.449850 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-utilities" (OuterVolumeSpecName: "utilities") pod "6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" (UID: "6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.456416 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-kube-api-access-lxtp8" (OuterVolumeSpecName: "kube-api-access-lxtp8") pod "6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" (UID: "6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4"). InnerVolumeSpecName "kube-api-access-lxtp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.505748 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" (UID: "6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.550641 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.550670 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxtp8\" (UniqueName: \"kubernetes.io/projected/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-kube-api-access-lxtp8\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:20 crc kubenswrapper[4825]: I1007 19:35:20.550680 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.006197 4825 generic.go:334] "Generic (PLEG): container finished" podID="e22aff57-c4de-445a-b196-23d2e791a10f" containerID="45f5755198e35875182ebdec1f080048bbb6c4b5ff85fc096dbf70a247adfd62" exitCode=0 Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.006366 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" event={"ID":"e22aff57-c4de-445a-b196-23d2e791a10f","Type":"ContainerDied","Data":"45f5755198e35875182ebdec1f080048bbb6c4b5ff85fc096dbf70a247adfd62"} Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.010527 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bx2n" event={"ID":"6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4","Type":"ContainerDied","Data":"b79ab1772e44b7da086161d1012a7d7e23f81ef854a456e72425390248f78bae"} Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.010570 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8bx2n" Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.010595 4825 scope.go:117] "RemoveContainer" containerID="5045471ec2dc1f7dc9915b4039d295d62f11fa66581d3331c12001a6a80c8936" Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.038437 4825 scope.go:117] "RemoveContainer" containerID="79a4d81830edf082b54c7b16af98bb9019505ff001ca119d05e529d107cbd539" Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.065835 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8bx2n"] Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.070762 4825 scope.go:117] "RemoveContainer" containerID="43376bc96c34a2b3cb3a36c0b395e8c4556d498089cc15df2bf945924b395099" Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.076594 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8bx2n"] Oct 07 19:35:21 crc kubenswrapper[4825]: I1007 19:35:21.810164 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" path="/var/lib/kubelet/pods/6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4/volumes" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.495499 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.590285 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e22aff57-c4de-445a-b196-23d2e791a10f\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.590370 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-ssh-key\") pod \"e22aff57-c4de-445a-b196-23d2e791a10f\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.590433 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-metadata-combined-ca-bundle\") pod \"e22aff57-c4de-445a-b196-23d2e791a10f\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.590525 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9q2h\" (UniqueName: \"kubernetes.io/projected/e22aff57-c4de-445a-b196-23d2e791a10f-kube-api-access-g9q2h\") pod \"e22aff57-c4de-445a-b196-23d2e791a10f\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.590656 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-nova-metadata-neutron-config-0\") pod \"e22aff57-c4de-445a-b196-23d2e791a10f\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " Oct 07 
19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.590819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-inventory\") pod \"e22aff57-c4de-445a-b196-23d2e791a10f\" (UID: \"e22aff57-c4de-445a-b196-23d2e791a10f\") " Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.596063 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22aff57-c4de-445a-b196-23d2e791a10f-kube-api-access-g9q2h" (OuterVolumeSpecName: "kube-api-access-g9q2h") pod "e22aff57-c4de-445a-b196-23d2e791a10f" (UID: "e22aff57-c4de-445a-b196-23d2e791a10f"). InnerVolumeSpecName "kube-api-access-g9q2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.603515 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e22aff57-c4de-445a-b196-23d2e791a10f" (UID: "e22aff57-c4de-445a-b196-23d2e791a10f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.625662 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e22aff57-c4de-445a-b196-23d2e791a10f" (UID: "e22aff57-c4de-445a-b196-23d2e791a10f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.627753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e22aff57-c4de-445a-b196-23d2e791a10f" (UID: "e22aff57-c4de-445a-b196-23d2e791a10f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.648341 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e22aff57-c4de-445a-b196-23d2e791a10f" (UID: "e22aff57-c4de-445a-b196-23d2e791a10f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.652630 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-inventory" (OuterVolumeSpecName: "inventory") pod "e22aff57-c4de-445a-b196-23d2e791a10f" (UID: "e22aff57-c4de-445a-b196-23d2e791a10f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.695333 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.695386 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.695406 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.695428 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9q2h\" (UniqueName: \"kubernetes.io/projected/e22aff57-c4de-445a-b196-23d2e791a10f-kube-api-access-g9q2h\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.695450 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:22 crc kubenswrapper[4825]: I1007 19:35:22.695470 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e22aff57-c4de-445a-b196-23d2e791a10f-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.039020 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" 
event={"ID":"e22aff57-c4de-445a-b196-23d2e791a10f","Type":"ContainerDied","Data":"327ae2f9f68873f96863d07c099889e6cd8f84abeed953640ea8f8266e5eea0a"} Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.039067 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="327ae2f9f68873f96863d07c099889e6cd8f84abeed953640ea8f8266e5eea0a" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.039416 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.138523 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw"] Oct 07 19:35:23 crc kubenswrapper[4825]: E1007 19:35:23.139108 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="extract-utilities" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.139139 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="extract-utilities" Oct 07 19:35:23 crc kubenswrapper[4825]: E1007 19:35:23.139175 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22aff57-c4de-445a-b196-23d2e791a10f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.139192 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22aff57-c4de-445a-b196-23d2e791a10f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 19:35:23 crc kubenswrapper[4825]: E1007 19:35:23.139253 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="registry-server" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.139279 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="registry-server" Oct 07 19:35:23 crc kubenswrapper[4825]: E1007 19:35:23.139300 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="extract-content" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.139314 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="extract-content" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.139622 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22aff57-c4de-445a-b196-23d2e791a10f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.139673 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9fb2ed-c6fb-49f1-aa0e-4085eb352ca4" containerName="registry-server" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.140714 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.143354 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.143953 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.144970 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.145384 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.145792 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.149636 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw"] Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.202438 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.202516 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwkx9\" (UniqueName: \"kubernetes.io/projected/6a978ccd-af77-4892-9bae-0f87170eb4a1-kube-api-access-qwkx9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.202570 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.202619 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.202887 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.304009 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.304379 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.304443 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.304500 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwkx9\" (UniqueName: \"kubernetes.io/projected/6a978ccd-af77-4892-9bae-0f87170eb4a1-kube-api-access-qwkx9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.304549 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.309400 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.309892 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.310489 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.316220 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.334599 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwkx9\" (UniqueName: \"kubernetes.io/projected/6a978ccd-af77-4892-9bae-0f87170eb4a1-kube-api-access-qwkx9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.483415 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:35:23 crc kubenswrapper[4825]: I1007 19:35:23.968024 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.053008 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5kh7"] Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.053367 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x5kh7" podUID="a065512e-39ae-4763-b562-fb77245458ae" containerName="registry-server" containerID="cri-o://c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6" gracePeriod=2 Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.132073 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw"] Oct 07 19:35:24 crc kubenswrapper[4825]: W1007 19:35:24.158105 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a978ccd_af77_4892_9bae_0f87170eb4a1.slice/crio-143bc8681a7ccc4db39b274faf99d6941dd6d7c4c752991881b2644baa70c264 WatchSource:0}: Error finding container 143bc8681a7ccc4db39b274faf99d6941dd6d7c4c752991881b2644baa70c264: Status 404 returned error can't find the container with id 143bc8681a7ccc4db39b274faf99d6941dd6d7c4c752991881b2644baa70c264 Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.526406 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.628763 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-utilities\") pod \"a065512e-39ae-4763-b562-fb77245458ae\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.629181 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-catalog-content\") pod \"a065512e-39ae-4763-b562-fb77245458ae\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.629215 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9t4z\" (UniqueName: \"kubernetes.io/projected/a065512e-39ae-4763-b562-fb77245458ae-kube-api-access-b9t4z\") pod \"a065512e-39ae-4763-b562-fb77245458ae\" (UID: \"a065512e-39ae-4763-b562-fb77245458ae\") " Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.630443 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-utilities" (OuterVolumeSpecName: "utilities") pod "a065512e-39ae-4763-b562-fb77245458ae" (UID: "a065512e-39ae-4763-b562-fb77245458ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.633435 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a065512e-39ae-4763-b562-fb77245458ae-kube-api-access-b9t4z" (OuterVolumeSpecName: "kube-api-access-b9t4z") pod "a065512e-39ae-4763-b562-fb77245458ae" (UID: "a065512e-39ae-4763-b562-fb77245458ae"). InnerVolumeSpecName "kube-api-access-b9t4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.676275 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a065512e-39ae-4763-b562-fb77245458ae" (UID: "a065512e-39ae-4763-b562-fb77245458ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.731440 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.731469 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a065512e-39ae-4763-b562-fb77245458ae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:24 crc kubenswrapper[4825]: I1007 19:35:24.731479 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9t4z\" (UniqueName: \"kubernetes.io/projected/a065512e-39ae-4763-b562-fb77245458ae-kube-api-access-b9t4z\") on node \"crc\" DevicePath \"\"" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.070060 4825 generic.go:334] "Generic (PLEG): container finished" podID="a065512e-39ae-4763-b562-fb77245458ae" containerID="c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6" exitCode=0 Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.070146 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5kh7" event={"ID":"a065512e-39ae-4763-b562-fb77245458ae","Type":"ContainerDied","Data":"c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6"} Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.070180 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x5kh7" event={"ID":"a065512e-39ae-4763-b562-fb77245458ae","Type":"ContainerDied","Data":"959d7cfebca01fd0cacb4bd36367b930b274ab0c0704cba43b2576618c1c392a"} Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.070185 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5kh7" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.070201 4825 scope.go:117] "RemoveContainer" containerID="c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.080112 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" event={"ID":"6a978ccd-af77-4892-9bae-0f87170eb4a1","Type":"ContainerStarted","Data":"ff90d3f1035de9addef94f7dedb739924d0e17bdbbe92d60e4dfd980248dd440"} Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.080162 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" event={"ID":"6a978ccd-af77-4892-9bae-0f87170eb4a1","Type":"ContainerStarted","Data":"143bc8681a7ccc4db39b274faf99d6941dd6d7c4c752991881b2644baa70c264"} Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.114781 4825 scope.go:117] "RemoveContainer" containerID="b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.147509 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" podStartSLOduration=1.9439574419999999 podStartE2EDuration="2.147486419s" podCreationTimestamp="2025-10-07 19:35:23 +0000 UTC" firstStartedPulling="2025-10-07 19:35:24.163620855 +0000 UTC m=+2112.985659502" lastFinishedPulling="2025-10-07 19:35:24.367149832 +0000 UTC m=+2113.189188479" observedRunningTime="2025-10-07 19:35:25.109586848 +0000 UTC 
m=+2113.931625485" watchObservedRunningTime="2025-10-07 19:35:25.147486419 +0000 UTC m=+2113.969525056" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.160849 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5kh7"] Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.163052 4825 scope.go:117] "RemoveContainer" containerID="572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.171020 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x5kh7"] Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.195658 4825 scope.go:117] "RemoveContainer" containerID="c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6" Oct 07 19:35:25 crc kubenswrapper[4825]: E1007 19:35:25.196183 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6\": container with ID starting with c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6 not found: ID does not exist" containerID="c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.196244 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6"} err="failed to get container status \"c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6\": rpc error: code = NotFound desc = could not find container \"c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6\": container with ID starting with c0ec007de100f418707d4f285651c1493c8d067ba6a4b1e70c8244ea3ca4e6c6 not found: ID does not exist" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.196274 4825 scope.go:117] "RemoveContainer" 
containerID="b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa" Oct 07 19:35:25 crc kubenswrapper[4825]: E1007 19:35:25.196762 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa\": container with ID starting with b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa not found: ID does not exist" containerID="b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.196804 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa"} err="failed to get container status \"b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa\": rpc error: code = NotFound desc = could not find container \"b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa\": container with ID starting with b3be5ac1243eff8db21f5da76e4d94fb8066bb0c3b697ee7ab0fcbee235145fa not found: ID does not exist" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.196833 4825 scope.go:117] "RemoveContainer" containerID="572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5" Oct 07 19:35:25 crc kubenswrapper[4825]: E1007 19:35:25.197168 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5\": container with ID starting with 572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5 not found: ID does not exist" containerID="572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.197219 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5"} err="failed to get container status \"572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5\": rpc error: code = NotFound desc = could not find container \"572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5\": container with ID starting with 572a89df4e560ee4849f481dec7884f3931b8fea08fc8b934add8b017a20cfa5 not found: ID does not exist" Oct 07 19:35:25 crc kubenswrapper[4825]: I1007 19:35:25.817785 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a065512e-39ae-4763-b562-fb77245458ae" path="/var/lib/kubelet/pods/a065512e-39ae-4763-b562-fb77245458ae/volumes" Oct 07 19:35:35 crc kubenswrapper[4825]: I1007 19:35:35.708637 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:35:35 crc kubenswrapper[4825]: I1007 19:35:35.709458 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:36:05 crc kubenswrapper[4825]: I1007 19:36:05.708826 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:36:05 crc kubenswrapper[4825]: I1007 19:36:05.709704 4825 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:36:05 crc kubenswrapper[4825]: I1007 19:36:05.709798 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:36:05 crc kubenswrapper[4825]: I1007 19:36:05.711049 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dc751bc7deb95ed4969acee0ba339cabe0592b5e0342dcfba004125f3c9f015"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:36:05 crc kubenswrapper[4825]: I1007 19:36:05.711194 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://7dc751bc7deb95ed4969acee0ba339cabe0592b5e0342dcfba004125f3c9f015" gracePeriod=600 Oct 07 19:36:06 crc kubenswrapper[4825]: I1007 19:36:06.558221 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="7dc751bc7deb95ed4969acee0ba339cabe0592b5e0342dcfba004125f3c9f015" exitCode=0 Oct 07 19:36:06 crc kubenswrapper[4825]: I1007 19:36:06.558299 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"7dc751bc7deb95ed4969acee0ba339cabe0592b5e0342dcfba004125f3c9f015"} Oct 07 19:36:06 crc kubenswrapper[4825]: I1007 19:36:06.558760 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d"} Oct 07 19:36:06 crc kubenswrapper[4825]: I1007 19:36:06.558786 4825 scope.go:117] "RemoveContainer" containerID="5f6c856951aaecb888c395add6fcd2a53bf05584b1a70c3f0e723fd2d6dce677" Oct 07 19:38:35 crc kubenswrapper[4825]: I1007 19:38:35.708694 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:38:35 crc kubenswrapper[4825]: I1007 19:38:35.709372 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:39:05 crc kubenswrapper[4825]: I1007 19:39:05.709943 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:39:05 crc kubenswrapper[4825]: I1007 19:39:05.711083 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:39:31 crc kubenswrapper[4825]: I1007 19:39:31.973670 4825 generic.go:334] 
"Generic (PLEG): container finished" podID="6a978ccd-af77-4892-9bae-0f87170eb4a1" containerID="ff90d3f1035de9addef94f7dedb739924d0e17bdbbe92d60e4dfd980248dd440" exitCode=0 Oct 07 19:39:31 crc kubenswrapper[4825]: I1007 19:39:31.974173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" event={"ID":"6a978ccd-af77-4892-9bae-0f87170eb4a1","Type":"ContainerDied","Data":"ff90d3f1035de9addef94f7dedb739924d0e17bdbbe92d60e4dfd980248dd440"} Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.410748 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.536365 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-ssh-key\") pod \"6a978ccd-af77-4892-9bae-0f87170eb4a1\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.536418 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-secret-0\") pod \"6a978ccd-af77-4892-9bae-0f87170eb4a1\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.536442 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-inventory\") pod \"6a978ccd-af77-4892-9bae-0f87170eb4a1\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.536596 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-combined-ca-bundle\") pod \"6a978ccd-af77-4892-9bae-0f87170eb4a1\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.536669 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwkx9\" (UniqueName: \"kubernetes.io/projected/6a978ccd-af77-4892-9bae-0f87170eb4a1-kube-api-access-qwkx9\") pod \"6a978ccd-af77-4892-9bae-0f87170eb4a1\" (UID: \"6a978ccd-af77-4892-9bae-0f87170eb4a1\") " Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.542871 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6a978ccd-af77-4892-9bae-0f87170eb4a1" (UID: "6a978ccd-af77-4892-9bae-0f87170eb4a1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.543133 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a978ccd-af77-4892-9bae-0f87170eb4a1-kube-api-access-qwkx9" (OuterVolumeSpecName: "kube-api-access-qwkx9") pod "6a978ccd-af77-4892-9bae-0f87170eb4a1" (UID: "6a978ccd-af77-4892-9bae-0f87170eb4a1"). InnerVolumeSpecName "kube-api-access-qwkx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.564262 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6a978ccd-af77-4892-9bae-0f87170eb4a1" (UID: "6a978ccd-af77-4892-9bae-0f87170eb4a1"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.572165 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6a978ccd-af77-4892-9bae-0f87170eb4a1" (UID: "6a978ccd-af77-4892-9bae-0f87170eb4a1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.573458 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-inventory" (OuterVolumeSpecName: "inventory") pod "6a978ccd-af77-4892-9bae-0f87170eb4a1" (UID: "6a978ccd-af77-4892-9bae-0f87170eb4a1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.638589 4825 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.638630 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwkx9\" (UniqueName: \"kubernetes.io/projected/6a978ccd-af77-4892-9bae-0f87170eb4a1-kube-api-access-qwkx9\") on node \"crc\" DevicePath \"\"" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.638646 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.638663 4825 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:39:33 crc 
kubenswrapper[4825]: I1007 19:39:33.638678 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a978ccd-af77-4892-9bae-0f87170eb4a1-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.998572 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" event={"ID":"6a978ccd-af77-4892-9bae-0f87170eb4a1","Type":"ContainerDied","Data":"143bc8681a7ccc4db39b274faf99d6941dd6d7c4c752991881b2644baa70c264"} Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.998881 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="143bc8681a7ccc4db39b274faf99d6941dd6d7c4c752991881b2644baa70c264" Oct 07 19:39:33 crc kubenswrapper[4825]: I1007 19:39:33.998668 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.099418 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl"] Oct 07 19:39:34 crc kubenswrapper[4825]: E1007 19:39:34.100112 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a978ccd-af77-4892-9bae-0f87170eb4a1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.100215 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a978ccd-af77-4892-9bae-0f87170eb4a1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 19:39:34 crc kubenswrapper[4825]: E1007 19:39:34.100350 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a065512e-39ae-4763-b562-fb77245458ae" containerName="registry-server" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.100422 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a065512e-39ae-4763-b562-fb77245458ae" 
containerName="registry-server" Oct 07 19:39:34 crc kubenswrapper[4825]: E1007 19:39:34.100520 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a065512e-39ae-4763-b562-fb77245458ae" containerName="extract-content" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.100634 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a065512e-39ae-4763-b562-fb77245458ae" containerName="extract-content" Oct 07 19:39:34 crc kubenswrapper[4825]: E1007 19:39:34.100734 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a065512e-39ae-4763-b562-fb77245458ae" containerName="extract-utilities" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.100818 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a065512e-39ae-4763-b562-fb77245458ae" containerName="extract-utilities" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.101103 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a978ccd-af77-4892-9bae-0f87170eb4a1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.101206 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a065512e-39ae-4763-b562-fb77245458ae" containerName="registry-server" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.102187 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.104028 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.106969 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.107078 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.107113 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.107171 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.107390 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.110517 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.119278 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl"] Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158353 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjf4j\" (UniqueName: \"kubernetes.io/projected/3c1f45e7-330e-4c79-8609-2988aac67b05-kube-api-access-tjf4j\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 
19:39:34.158420 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158482 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158562 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158592 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158620 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158704 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.158764 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.260609 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.260684 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.260935 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.260973 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.261007 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.261073 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.261106 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjf4j\" (UniqueName: \"kubernetes.io/projected/3c1f45e7-330e-4c79-8609-2988aac67b05-kube-api-access-tjf4j\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.261138 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.261191 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.264344 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: 
I1007 19:39:34.267121 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.269139 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.271225 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.271541 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.276311 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: 
\"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.283542 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.283680 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.292555 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjf4j\" (UniqueName: \"kubernetes.io/projected/3c1f45e7-330e-4c79-8609-2988aac67b05-kube-api-access-tjf4j\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l9nwl\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.418326 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.974338 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl"] Oct 07 19:39:34 crc kubenswrapper[4825]: I1007 19:39:34.979166 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:39:35 crc kubenswrapper[4825]: I1007 19:39:35.016461 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" event={"ID":"3c1f45e7-330e-4c79-8609-2988aac67b05","Type":"ContainerStarted","Data":"fd7891459854931247176a8d1dcca5560223865355f5c123013e0ce9a84b9237"} Oct 07 19:39:35 crc kubenswrapper[4825]: I1007 19:39:35.708685 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:39:35 crc kubenswrapper[4825]: I1007 19:39:35.708997 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:39:35 crc kubenswrapper[4825]: I1007 19:39:35.709052 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:39:35 crc kubenswrapper[4825]: I1007 19:39:35.709998 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d"} 
pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:39:35 crc kubenswrapper[4825]: I1007 19:39:35.710070 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" gracePeriod=600 Oct 07 19:39:35 crc kubenswrapper[4825]: E1007 19:39:35.849854 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:39:36 crc kubenswrapper[4825]: I1007 19:39:36.030971 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" event={"ID":"3c1f45e7-330e-4c79-8609-2988aac67b05","Type":"ContainerStarted","Data":"dc6a95d1213fb2f2a5f8c209e16d2f91a3bc5cbec300eec59a2f9ee94e09eab7"} Oct 07 19:39:36 crc kubenswrapper[4825]: I1007 19:39:36.035128 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" exitCode=0 Oct 07 19:39:36 crc kubenswrapper[4825]: I1007 19:39:36.035191 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d"} Oct 07 19:39:36 crc 
kubenswrapper[4825]: I1007 19:39:36.035333 4825 scope.go:117] "RemoveContainer" containerID="7dc751bc7deb95ed4969acee0ba339cabe0592b5e0342dcfba004125f3c9f015" Oct 07 19:39:36 crc kubenswrapper[4825]: I1007 19:39:36.035853 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:39:36 crc kubenswrapper[4825]: E1007 19:39:36.036380 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:39:36 crc kubenswrapper[4825]: I1007 19:39:36.057120 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" podStartSLOduration=1.7785598550000001 podStartE2EDuration="2.057099432s" podCreationTimestamp="2025-10-07 19:39:34 +0000 UTC" firstStartedPulling="2025-10-07 19:39:34.978956564 +0000 UTC m=+2363.800995201" lastFinishedPulling="2025-10-07 19:39:35.257496141 +0000 UTC m=+2364.079534778" observedRunningTime="2025-10-07 19:39:36.048595341 +0000 UTC m=+2364.870633988" watchObservedRunningTime="2025-10-07 19:39:36.057099432 +0000 UTC m=+2364.879138069" Oct 07 19:39:50 crc kubenswrapper[4825]: I1007 19:39:50.794917 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:39:50 crc kubenswrapper[4825]: E1007 19:39:50.795737 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:40:02 crc kubenswrapper[4825]: I1007 19:40:02.795961 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:40:02 crc kubenswrapper[4825]: E1007 19:40:02.796911 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:40:14 crc kubenswrapper[4825]: I1007 19:40:14.794852 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:40:14 crc kubenswrapper[4825]: E1007 19:40:14.795477 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:40:27 crc kubenswrapper[4825]: I1007 19:40:27.796075 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:40:27 crc kubenswrapper[4825]: E1007 19:40:27.796891 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:40:41 crc kubenswrapper[4825]: I1007 19:40:41.822665 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:40:41 crc kubenswrapper[4825]: E1007 19:40:41.824011 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:40:55 crc kubenswrapper[4825]: I1007 19:40:55.795656 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:40:55 crc kubenswrapper[4825]: E1007 19:40:55.796834 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:41:09 crc kubenswrapper[4825]: I1007 19:41:09.796683 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:41:09 crc kubenswrapper[4825]: E1007 19:41:09.798895 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:41:23 crc kubenswrapper[4825]: I1007 19:41:23.795253 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:41:23 crc kubenswrapper[4825]: E1007 19:41:23.797448 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:41:35 crc kubenswrapper[4825]: I1007 19:41:35.796093 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:41:35 crc kubenswrapper[4825]: E1007 19:41:35.796784 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.571708 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqj9f"] Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.574768 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.585088 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqj9f"] Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.729988 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfnt\" (UniqueName: \"kubernetes.io/projected/24947946-3a68-4be5-8158-5a4cdc44cedb-kube-api-access-mrfnt\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.730056 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-utilities\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.730137 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-catalog-content\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.831340 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfnt\" (UniqueName: \"kubernetes.io/projected/24947946-3a68-4be5-8158-5a4cdc44cedb-kube-api-access-mrfnt\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.831575 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-utilities\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.831740 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-catalog-content\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.832399 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-catalog-content\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.832810 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-utilities\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.849708 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrfnt\" (UniqueName: \"kubernetes.io/projected/24947946-3a68-4be5-8158-5a4cdc44cedb-kube-api-access-mrfnt\") pod \"redhat-operators-gqj9f\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:43 crc kubenswrapper[4825]: I1007 19:41:43.956807 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:44 crc kubenswrapper[4825]: I1007 19:41:44.414833 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqj9f"] Oct 07 19:41:44 crc kubenswrapper[4825]: I1007 19:41:44.585957 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqj9f" event={"ID":"24947946-3a68-4be5-8158-5a4cdc44cedb","Type":"ContainerStarted","Data":"78c4f7a6293556b743afc496cc993df28dbfba1fa964a3852beb3ddc401e5cf6"} Oct 07 19:41:45 crc kubenswrapper[4825]: I1007 19:41:45.598333 4825 generic.go:334] "Generic (PLEG): container finished" podID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerID="c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb" exitCode=0 Oct 07 19:41:45 crc kubenswrapper[4825]: I1007 19:41:45.598410 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqj9f" event={"ID":"24947946-3a68-4be5-8158-5a4cdc44cedb","Type":"ContainerDied","Data":"c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb"} Oct 07 19:41:46 crc kubenswrapper[4825]: I1007 19:41:46.795188 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:41:46 crc kubenswrapper[4825]: E1007 19:41:46.795954 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:41:47 crc kubenswrapper[4825]: I1007 19:41:47.621770 4825 generic.go:334] "Generic (PLEG): container finished" podID="24947946-3a68-4be5-8158-5a4cdc44cedb" 
containerID="e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4" exitCode=0 Oct 07 19:41:47 crc kubenswrapper[4825]: I1007 19:41:47.622070 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqj9f" event={"ID":"24947946-3a68-4be5-8158-5a4cdc44cedb","Type":"ContainerDied","Data":"e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4"} Oct 07 19:41:48 crc kubenswrapper[4825]: I1007 19:41:48.632059 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqj9f" event={"ID":"24947946-3a68-4be5-8158-5a4cdc44cedb","Type":"ContainerStarted","Data":"e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad"} Oct 07 19:41:48 crc kubenswrapper[4825]: I1007 19:41:48.651707 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqj9f" podStartSLOduration=3.17027899 podStartE2EDuration="5.651691456s" podCreationTimestamp="2025-10-07 19:41:43 +0000 UTC" firstStartedPulling="2025-10-07 19:41:45.602360676 +0000 UTC m=+2494.424399323" lastFinishedPulling="2025-10-07 19:41:48.083773152 +0000 UTC m=+2496.905811789" observedRunningTime="2025-10-07 19:41:48.645146783 +0000 UTC m=+2497.467185430" watchObservedRunningTime="2025-10-07 19:41:48.651691456 +0000 UTC m=+2497.473730093" Oct 07 19:41:53 crc kubenswrapper[4825]: I1007 19:41:53.957854 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:53 crc kubenswrapper[4825]: I1007 19:41:53.958589 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:54 crc kubenswrapper[4825]: I1007 19:41:54.043526 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:54 crc kubenswrapper[4825]: I1007 19:41:54.754399 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:54 crc kubenswrapper[4825]: I1007 19:41:54.838751 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqj9f"] Oct 07 19:41:56 crc kubenswrapper[4825]: I1007 19:41:56.710145 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqj9f" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="registry-server" containerID="cri-o://e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad" gracePeriod=2 Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.162043 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.322515 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-catalog-content\") pod \"24947946-3a68-4be5-8158-5a4cdc44cedb\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.322704 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-utilities\") pod \"24947946-3a68-4be5-8158-5a4cdc44cedb\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.322793 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrfnt\" (UniqueName: \"kubernetes.io/projected/24947946-3a68-4be5-8158-5a4cdc44cedb-kube-api-access-mrfnt\") pod \"24947946-3a68-4be5-8158-5a4cdc44cedb\" (UID: \"24947946-3a68-4be5-8158-5a4cdc44cedb\") " Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.323732 4825 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-utilities" (OuterVolumeSpecName: "utilities") pod "24947946-3a68-4be5-8158-5a4cdc44cedb" (UID: "24947946-3a68-4be5-8158-5a4cdc44cedb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.336367 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24947946-3a68-4be5-8158-5a4cdc44cedb-kube-api-access-mrfnt" (OuterVolumeSpecName: "kube-api-access-mrfnt") pod "24947946-3a68-4be5-8158-5a4cdc44cedb" (UID: "24947946-3a68-4be5-8158-5a4cdc44cedb"). InnerVolumeSpecName "kube-api-access-mrfnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.424634 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.424700 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrfnt\" (UniqueName: \"kubernetes.io/projected/24947946-3a68-4be5-8158-5a4cdc44cedb-kube-api-access-mrfnt\") on node \"crc\" DevicePath \"\"" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.436151 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24947946-3a68-4be5-8158-5a4cdc44cedb" (UID: "24947946-3a68-4be5-8158-5a4cdc44cedb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.526315 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24947946-3a68-4be5-8158-5a4cdc44cedb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.721697 4825 generic.go:334] "Generic (PLEG): container finished" podID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerID="e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad" exitCode=0 Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.721751 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqj9f" event={"ID":"24947946-3a68-4be5-8158-5a4cdc44cedb","Type":"ContainerDied","Data":"e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad"} Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.721784 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqj9f" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.722212 4825 scope.go:117] "RemoveContainer" containerID="e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.722062 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqj9f" event={"ID":"24947946-3a68-4be5-8158-5a4cdc44cedb","Type":"ContainerDied","Data":"78c4f7a6293556b743afc496cc993df28dbfba1fa964a3852beb3ddc401e5cf6"} Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.757540 4825 scope.go:117] "RemoveContainer" containerID="e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.771258 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqj9f"] Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.784041 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqj9f"] Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.795338 4825 scope.go:117] "RemoveContainer" containerID="c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.816785 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" path="/var/lib/kubelet/pods/24947946-3a68-4be5-8158-5a4cdc44cedb/volumes" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.843881 4825 scope.go:117] "RemoveContainer" containerID="e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad" Oct 07 19:41:57 crc kubenswrapper[4825]: E1007 19:41:57.844627 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad\": container with ID starting with 
e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad not found: ID does not exist" containerID="e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.844679 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad"} err="failed to get container status \"e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad\": rpc error: code = NotFound desc = could not find container \"e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad\": container with ID starting with e836518011e305306f358f8ccf7bb6d97bde46521e6767992aec0e93a6f878ad not found: ID does not exist" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.844706 4825 scope.go:117] "RemoveContainer" containerID="e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4" Oct 07 19:41:57 crc kubenswrapper[4825]: E1007 19:41:57.845235 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4\": container with ID starting with e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4 not found: ID does not exist" containerID="e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.845266 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4"} err="failed to get container status \"e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4\": rpc error: code = NotFound desc = could not find container \"e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4\": container with ID starting with e2f3b1f4f0c9aa1ebd214fddb79f4d616a0518fea63c48acd33de72388e8dff4 not found: ID does not 
exist" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.845288 4825 scope.go:117] "RemoveContainer" containerID="c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb" Oct 07 19:41:57 crc kubenswrapper[4825]: E1007 19:41:57.845893 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb\": container with ID starting with c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb not found: ID does not exist" containerID="c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb" Oct 07 19:41:57 crc kubenswrapper[4825]: I1007 19:41:57.845952 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb"} err="failed to get container status \"c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb\": rpc error: code = NotFound desc = could not find container \"c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb\": container with ID starting with c0d7b363d103be35d0c3de86cc78c1d5d01aaf47cf56676a78817bcd630a70fb not found: ID does not exist" Oct 07 19:42:01 crc kubenswrapper[4825]: I1007 19:42:01.811200 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:42:01 crc kubenswrapper[4825]: E1007 19:42:01.812444 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:42:16 crc kubenswrapper[4825]: I1007 19:42:16.795130 4825 scope.go:117] 
"RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:42:16 crc kubenswrapper[4825]: E1007 19:42:16.796032 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:42:27 crc kubenswrapper[4825]: I1007 19:42:27.795715 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:42:27 crc kubenswrapper[4825]: E1007 19:42:27.796663 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:42:39 crc kubenswrapper[4825]: I1007 19:42:39.796597 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:42:39 crc kubenswrapper[4825]: E1007 19:42:39.797739 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:42:54 crc kubenswrapper[4825]: I1007 19:42:54.796528 
4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:42:54 crc kubenswrapper[4825]: E1007 19:42:54.799640 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:42:59 crc kubenswrapper[4825]: I1007 19:42:59.404104 4825 generic.go:334] "Generic (PLEG): container finished" podID="3c1f45e7-330e-4c79-8609-2988aac67b05" containerID="dc6a95d1213fb2f2a5f8c209e16d2f91a3bc5cbec300eec59a2f9ee94e09eab7" exitCode=0 Oct 07 19:42:59 crc kubenswrapper[4825]: I1007 19:42:59.404213 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" event={"ID":"3c1f45e7-330e-4c79-8609-2988aac67b05","Type":"ContainerDied","Data":"dc6a95d1213fb2f2a5f8c209e16d2f91a3bc5cbec300eec59a2f9ee94e09eab7"} Oct 07 19:43:00 crc kubenswrapper[4825]: I1007 19:43:00.916532 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099211 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-ssh-key\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099489 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-1\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099509 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-0\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099545 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-0\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099648 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjf4j\" (UniqueName: \"kubernetes.io/projected/3c1f45e7-330e-4c79-8609-2988aac67b05-kube-api-access-tjf4j\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099677 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-inventory\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099746 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-extra-config-0\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099828 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-combined-ca-bundle\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.099887 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-1\") pod \"3c1f45e7-330e-4c79-8609-2988aac67b05\" (UID: \"3c1f45e7-330e-4c79-8609-2988aac67b05\") " Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.107438 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1f45e7-330e-4c79-8609-2988aac67b05-kube-api-access-tjf4j" (OuterVolumeSpecName: "kube-api-access-tjf4j") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "kube-api-access-tjf4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.107837 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.128954 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.130177 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-inventory" (OuterVolumeSpecName: "inventory") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.131762 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.135549 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.138347 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.142217 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.152909 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3c1f45e7-330e-4c79-8609-2988aac67b05" (UID: "3c1f45e7-330e-4c79-8609-2988aac67b05"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.202567 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjf4j\" (UniqueName: \"kubernetes.io/projected/3c1f45e7-330e-4c79-8609-2988aac67b05-kube-api-access-tjf4j\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.202786 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.202897 4825 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.202975 4825 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.203059 4825 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.203147 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.203219 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc 
kubenswrapper[4825]: I1007 19:43:01.203325 4825 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.203399 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3c1f45e7-330e-4c79-8609-2988aac67b05-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.423390 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" event={"ID":"3c1f45e7-330e-4c79-8609-2988aac67b05","Type":"ContainerDied","Data":"fd7891459854931247176a8d1dcca5560223865355f5c123013e0ce9a84b9237"} Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.423446 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd7891459854931247176a8d1dcca5560223865355f5c123013e0ce9a84b9237" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.423522 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l9nwl" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.561292 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv"] Oct 07 19:43:01 crc kubenswrapper[4825]: E1007 19:43:01.561638 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="extract-utilities" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.561654 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="extract-utilities" Oct 07 19:43:01 crc kubenswrapper[4825]: E1007 19:43:01.561668 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="extract-content" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.561674 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="extract-content" Oct 07 19:43:01 crc kubenswrapper[4825]: E1007 19:43:01.561696 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="registry-server" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.561702 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="registry-server" Oct 07 19:43:01 crc kubenswrapper[4825]: E1007 19:43:01.561722 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1f45e7-330e-4c79-8609-2988aac67b05" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.561729 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1f45e7-330e-4c79-8609-2988aac67b05" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.561901 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="24947946-3a68-4be5-8158-5a4cdc44cedb" containerName="registry-server" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.561928 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1f45e7-330e-4c79-8609-2988aac67b05" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.562543 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.572745 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.573140 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.573677 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lr8sm" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.574397 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv"] Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.574585 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.574638 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.610191 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: 
\"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.610287 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnrgf\" (UniqueName: \"kubernetes.io/projected/803e0c1d-979b-47da-ba63-cad0323972a8-kube-api-access-wnrgf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.610323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.610359 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.610385 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.610405 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.610451 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.711730 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.711794 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.711838 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wnrgf\" (UniqueName: \"kubernetes.io/projected/803e0c1d-979b-47da-ba63-cad0323972a8-kube-api-access-wnrgf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.711870 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.711906 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.711932 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.711955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.716809 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.717766 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.718144 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.719407 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: 
I1007 19:43:01.719652 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.726703 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.727065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnrgf\" (UniqueName: \"kubernetes.io/projected/803e0c1d-979b-47da-ba63-cad0323972a8-kube-api-access-wnrgf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:01 crc kubenswrapper[4825]: I1007 19:43:01.886825 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:43:02 crc kubenswrapper[4825]: I1007 19:43:02.425886 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv"] Oct 07 19:43:03 crc kubenswrapper[4825]: I1007 19:43:03.446979 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" event={"ID":"803e0c1d-979b-47da-ba63-cad0323972a8","Type":"ContainerStarted","Data":"e5a75b5f16c4d9e584c2d6f7d2166fb9c4265bd997dd129d0343520e43f24779"} Oct 07 19:43:03 crc kubenswrapper[4825]: I1007 19:43:03.447323 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" event={"ID":"803e0c1d-979b-47da-ba63-cad0323972a8","Type":"ContainerStarted","Data":"c977d4eb04e171e2a4492c15b234909ca2aee25d7ad6f805e4792be20d7d00d7"} Oct 07 19:43:07 crc kubenswrapper[4825]: I1007 19:43:07.795158 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:43:07 crc kubenswrapper[4825]: E1007 19:43:07.795956 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:43:22 crc kubenswrapper[4825]: I1007 19:43:22.795741 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:43:22 crc kubenswrapper[4825]: E1007 19:43:22.796852 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:43:33 crc kubenswrapper[4825]: I1007 19:43:33.795646 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:43:33 crc kubenswrapper[4825]: E1007 19:43:33.796965 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:43:46 crc kubenswrapper[4825]: I1007 19:43:46.795315 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:43:46 crc kubenswrapper[4825]: E1007 19:43:46.796128 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:44:01 crc kubenswrapper[4825]: I1007 19:44:01.807848 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:44:01 crc kubenswrapper[4825]: E1007 19:44:01.808953 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:44:15 crc kubenswrapper[4825]: I1007 19:44:15.796186 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:44:15 crc kubenswrapper[4825]: E1007 19:44:15.797044 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:44:29 crc kubenswrapper[4825]: I1007 19:44:29.795591 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:44:29 crc kubenswrapper[4825]: E1007 19:44:29.797555 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:44:44 crc kubenswrapper[4825]: I1007 19:44:44.796989 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:44:45 crc kubenswrapper[4825]: I1007 19:44:45.557525 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"57b89716c03c7599611d4dae6b8b92e7c7cf3c08e25e88a735fdb4005f3714e3"} Oct 07 19:44:45 crc kubenswrapper[4825]: I1007 19:44:45.597608 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" podStartSLOduration=104.387732815 podStartE2EDuration="1m44.597591091s" podCreationTimestamp="2025-10-07 19:43:01 +0000 UTC" firstStartedPulling="2025-10-07 19:43:02.425566948 +0000 UTC m=+2571.247605585" lastFinishedPulling="2025-10-07 19:43:02.635425224 +0000 UTC m=+2571.457463861" observedRunningTime="2025-10-07 19:43:03.47753584 +0000 UTC m=+2572.299574507" watchObservedRunningTime="2025-10-07 19:44:45.597591091 +0000 UTC m=+2674.419629728" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.160537 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-842c5"] Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.168442 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.202611 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-842c5"] Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.290280 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-utilities\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.290432 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8vkc\" (UniqueName: \"kubernetes.io/projected/b7de2029-f003-4278-9b43-c6a9b2246817-kube-api-access-t8vkc\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.290508 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-catalog-content\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.392715 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-utilities\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.392833 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-t8vkc\" (UniqueName: \"kubernetes.io/projected/b7de2029-f003-4278-9b43-c6a9b2246817-kube-api-access-t8vkc\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.392898 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-catalog-content\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.393182 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-utilities\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.393601 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-catalog-content\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.414518 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8vkc\" (UniqueName: \"kubernetes.io/projected/b7de2029-f003-4278-9b43-c6a9b2246817-kube-api-access-t8vkc\") pod \"redhat-marketplace-842c5\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.511361 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:44:57 crc kubenswrapper[4825]: I1007 19:44:57.958652 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-842c5"] Oct 07 19:44:58 crc kubenswrapper[4825]: I1007 19:44:58.706662 4825 generic.go:334] "Generic (PLEG): container finished" podID="b7de2029-f003-4278-9b43-c6a9b2246817" containerID="c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c" exitCode=0 Oct 07 19:44:58 crc kubenswrapper[4825]: I1007 19:44:58.706773 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-842c5" event={"ID":"b7de2029-f003-4278-9b43-c6a9b2246817","Type":"ContainerDied","Data":"c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c"} Oct 07 19:44:58 crc kubenswrapper[4825]: I1007 19:44:58.707183 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-842c5" event={"ID":"b7de2029-f003-4278-9b43-c6a9b2246817","Type":"ContainerStarted","Data":"3d0536e8ae673d695f2b4d6b418cadcd5b45f8f600363d482e6a688ad01bfdf2"} Oct 07 19:44:58 crc kubenswrapper[4825]: I1007 19:44:58.711044 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.162369 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7"] Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.164481 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.166812 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.167963 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.178039 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7"] Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.256178 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93891dce-ec75-4246-be96-ee293b00534f-secret-volume\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.256361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93891dce-ec75-4246-be96-ee293b00534f-config-volume\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.256425 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ktdz\" (UniqueName: \"kubernetes.io/projected/93891dce-ec75-4246-be96-ee293b00534f-kube-api-access-7ktdz\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.358651 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93891dce-ec75-4246-be96-ee293b00534f-secret-volume\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.358836 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93891dce-ec75-4246-be96-ee293b00534f-config-volume\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.358954 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ktdz\" (UniqueName: \"kubernetes.io/projected/93891dce-ec75-4246-be96-ee293b00534f-kube-api-access-7ktdz\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.360163 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93891dce-ec75-4246-be96-ee293b00534f-config-volume\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.368146 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/93891dce-ec75-4246-be96-ee293b00534f-secret-volume\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.379470 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ktdz\" (UniqueName: \"kubernetes.io/projected/93891dce-ec75-4246-be96-ee293b00534f-kube-api-access-7ktdz\") pod \"collect-profiles-29331105-r8hn7\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.492811 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.729644 4825 generic.go:334] "Generic (PLEG): container finished" podID="b7de2029-f003-4278-9b43-c6a9b2246817" containerID="5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba" exitCode=0 Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.729872 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-842c5" event={"ID":"b7de2029-f003-4278-9b43-c6a9b2246817","Type":"ContainerDied","Data":"5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba"} Oct 07 19:45:00 crc kubenswrapper[4825]: I1007 19:45:00.964897 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7"] Oct 07 19:45:01 crc kubenswrapper[4825]: I1007 19:45:01.746305 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-842c5" event={"ID":"b7de2029-f003-4278-9b43-c6a9b2246817","Type":"ContainerStarted","Data":"0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514"} Oct 07 19:45:01 crc 
kubenswrapper[4825]: I1007 19:45:01.752414 4825 generic.go:334] "Generic (PLEG): container finished" podID="93891dce-ec75-4246-be96-ee293b00534f" containerID="37cc424dd9260ce264959381e1b5d8f867bb8c1232c48c817c1bc712ebc3e16c" exitCode=0 Oct 07 19:45:01 crc kubenswrapper[4825]: I1007 19:45:01.752490 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" event={"ID":"93891dce-ec75-4246-be96-ee293b00534f","Type":"ContainerDied","Data":"37cc424dd9260ce264959381e1b5d8f867bb8c1232c48c817c1bc712ebc3e16c"} Oct 07 19:45:01 crc kubenswrapper[4825]: I1007 19:45:01.752531 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" event={"ID":"93891dce-ec75-4246-be96-ee293b00534f","Type":"ContainerStarted","Data":"8971e389ff73731b4539a49b365f9d86e9b486340e1ce12b7268e4c5c64c0d82"} Oct 07 19:45:01 crc kubenswrapper[4825]: I1007 19:45:01.780385 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-842c5" podStartSLOduration=2.152093887 podStartE2EDuration="4.780368849s" podCreationTimestamp="2025-10-07 19:44:57 +0000 UTC" firstStartedPulling="2025-10-07 19:44:58.710405737 +0000 UTC m=+2687.532444414" lastFinishedPulling="2025-10-07 19:45:01.338680699 +0000 UTC m=+2690.160719376" observedRunningTime="2025-10-07 19:45:01.777483304 +0000 UTC m=+2690.599521951" watchObservedRunningTime="2025-10-07 19:45:01.780368849 +0000 UTC m=+2690.602407496" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.164337 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.317699 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ktdz\" (UniqueName: \"kubernetes.io/projected/93891dce-ec75-4246-be96-ee293b00534f-kube-api-access-7ktdz\") pod \"93891dce-ec75-4246-be96-ee293b00534f\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.318216 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93891dce-ec75-4246-be96-ee293b00534f-config-volume\") pod \"93891dce-ec75-4246-be96-ee293b00534f\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.318299 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93891dce-ec75-4246-be96-ee293b00534f-secret-volume\") pod \"93891dce-ec75-4246-be96-ee293b00534f\" (UID: \"93891dce-ec75-4246-be96-ee293b00534f\") " Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.318933 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93891dce-ec75-4246-be96-ee293b00534f-config-volume" (OuterVolumeSpecName: "config-volume") pod "93891dce-ec75-4246-be96-ee293b00534f" (UID: "93891dce-ec75-4246-be96-ee293b00534f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.328339 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93891dce-ec75-4246-be96-ee293b00534f-kube-api-access-7ktdz" (OuterVolumeSpecName: "kube-api-access-7ktdz") pod "93891dce-ec75-4246-be96-ee293b00534f" (UID: "93891dce-ec75-4246-be96-ee293b00534f"). 
InnerVolumeSpecName "kube-api-access-7ktdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.330060 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93891dce-ec75-4246-be96-ee293b00534f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "93891dce-ec75-4246-be96-ee293b00534f" (UID: "93891dce-ec75-4246-be96-ee293b00534f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.419904 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ktdz\" (UniqueName: \"kubernetes.io/projected/93891dce-ec75-4246-be96-ee293b00534f-kube-api-access-7ktdz\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.419958 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93891dce-ec75-4246-be96-ee293b00534f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.419972 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93891dce-ec75-4246-be96-ee293b00534f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.794285 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.819865 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331105-r8hn7" event={"ID":"93891dce-ec75-4246-be96-ee293b00534f","Type":"ContainerDied","Data":"8971e389ff73731b4539a49b365f9d86e9b486340e1ce12b7268e4c5c64c0d82"} Oct 07 19:45:03 crc kubenswrapper[4825]: I1007 19:45:03.819992 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8971e389ff73731b4539a49b365f9d86e9b486340e1ce12b7268e4c5c64c0d82" Oct 07 19:45:04 crc kubenswrapper[4825]: I1007 19:45:04.275245 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn"] Oct 07 19:45:04 crc kubenswrapper[4825]: I1007 19:45:04.283523 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331060-f6rbn"] Oct 07 19:45:05 crc kubenswrapper[4825]: I1007 19:45:05.810472 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b305d341-f68e-40db-b37c-11660cdac447" path="/var/lib/kubelet/pods/b305d341-f68e-40db-b37c-11660cdac447/volumes" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.640100 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vr2tg"] Oct 07 19:45:06 crc kubenswrapper[4825]: E1007 19:45:06.641164 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93891dce-ec75-4246-be96-ee293b00534f" containerName="collect-profiles" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.641193 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="93891dce-ec75-4246-be96-ee293b00534f" containerName="collect-profiles" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.641577 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="93891dce-ec75-4246-be96-ee293b00534f" containerName="collect-profiles" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.644219 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.670317 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vr2tg"] Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.790173 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrzf4\" (UniqueName: \"kubernetes.io/projected/7e4a6aa1-3729-4f9a-b071-511d3246e908-kube-api-access-vrzf4\") pod \"certified-operators-vr2tg\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.790294 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-catalog-content\") pod \"certified-operators-vr2tg\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.790455 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-utilities\") pod \"certified-operators-vr2tg\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.892403 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-utilities\") pod \"certified-operators-vr2tg\" (UID: 
\"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.892572 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrzf4\" (UniqueName: \"kubernetes.io/projected/7e4a6aa1-3729-4f9a-b071-511d3246e908-kube-api-access-vrzf4\") pod \"certified-operators-vr2tg\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.892636 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-catalog-content\") pod \"certified-operators-vr2tg\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.893672 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-utilities\") pod \"certified-operators-vr2tg\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.893862 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-catalog-content\") pod \"certified-operators-vr2tg\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.924018 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrzf4\" (UniqueName: \"kubernetes.io/projected/7e4a6aa1-3729-4f9a-b071-511d3246e908-kube-api-access-vrzf4\") pod \"certified-operators-vr2tg\" (UID: 
\"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:06 crc kubenswrapper[4825]: I1007 19:45:06.983207 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.475757 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vr2tg"] Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.511550 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.511589 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.562092 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.842741 4825 generic.go:334] "Generic (PLEG): container finished" podID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerID="26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7" exitCode=0 Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.842800 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vr2tg" event={"ID":"7e4a6aa1-3729-4f9a-b071-511d3246e908","Type":"ContainerDied","Data":"26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7"} Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.843388 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vr2tg" event={"ID":"7e4a6aa1-3729-4f9a-b071-511d3246e908","Type":"ContainerStarted","Data":"b8bddd3ec43c954b04009c0fce3b007c4e52cbe1c6a6b9f97f1e18b04dbb331b"} Oct 07 19:45:07 crc kubenswrapper[4825]: I1007 19:45:07.910598 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:45:08 crc kubenswrapper[4825]: I1007 19:45:08.853346 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vr2tg" event={"ID":"7e4a6aa1-3729-4f9a-b071-511d3246e908","Type":"ContainerStarted","Data":"811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54"} Oct 07 19:45:09 crc kubenswrapper[4825]: I1007 19:45:09.814373 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-842c5"] Oct 07 19:45:09 crc kubenswrapper[4825]: I1007 19:45:09.865188 4825 generic.go:334] "Generic (PLEG): container finished" podID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerID="811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54" exitCode=0 Oct 07 19:45:09 crc kubenswrapper[4825]: I1007 19:45:09.865494 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vr2tg" event={"ID":"7e4a6aa1-3729-4f9a-b071-511d3246e908","Type":"ContainerDied","Data":"811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54"} Oct 07 19:45:09 crc kubenswrapper[4825]: I1007 19:45:09.865606 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-842c5" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="registry-server" containerID="cri-o://0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514" gracePeriod=2 Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.423559 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.578643 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-utilities\") pod \"b7de2029-f003-4278-9b43-c6a9b2246817\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.578844 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8vkc\" (UniqueName: \"kubernetes.io/projected/b7de2029-f003-4278-9b43-c6a9b2246817-kube-api-access-t8vkc\") pod \"b7de2029-f003-4278-9b43-c6a9b2246817\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.579175 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-catalog-content\") pod \"b7de2029-f003-4278-9b43-c6a9b2246817\" (UID: \"b7de2029-f003-4278-9b43-c6a9b2246817\") " Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.579435 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-utilities" (OuterVolumeSpecName: "utilities") pod "b7de2029-f003-4278-9b43-c6a9b2246817" (UID: "b7de2029-f003-4278-9b43-c6a9b2246817"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.579947 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.586787 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7de2029-f003-4278-9b43-c6a9b2246817-kube-api-access-t8vkc" (OuterVolumeSpecName: "kube-api-access-t8vkc") pod "b7de2029-f003-4278-9b43-c6a9b2246817" (UID: "b7de2029-f003-4278-9b43-c6a9b2246817"). InnerVolumeSpecName "kube-api-access-t8vkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.596903 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7de2029-f003-4278-9b43-c6a9b2246817" (UID: "b7de2029-f003-4278-9b43-c6a9b2246817"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.681541 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7de2029-f003-4278-9b43-c6a9b2246817-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.681572 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8vkc\" (UniqueName: \"kubernetes.io/projected/b7de2029-f003-4278-9b43-c6a9b2246817-kube-api-access-t8vkc\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.879329 4825 generic.go:334] "Generic (PLEG): container finished" podID="b7de2029-f003-4278-9b43-c6a9b2246817" containerID="0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514" exitCode=0 Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.879393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-842c5" event={"ID":"b7de2029-f003-4278-9b43-c6a9b2246817","Type":"ContainerDied","Data":"0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514"} Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.879420 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-842c5" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.879459 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-842c5" event={"ID":"b7de2029-f003-4278-9b43-c6a9b2246817","Type":"ContainerDied","Data":"3d0536e8ae673d695f2b4d6b418cadcd5b45f8f600363d482e6a688ad01bfdf2"} Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.879505 4825 scope.go:117] "RemoveContainer" containerID="0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.883123 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vr2tg" event={"ID":"7e4a6aa1-3729-4f9a-b071-511d3246e908","Type":"ContainerStarted","Data":"abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730"} Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.912348 4825 scope.go:117] "RemoveContainer" containerID="5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.927286 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vr2tg" podStartSLOduration=2.484691986 podStartE2EDuration="4.927270321s" podCreationTimestamp="2025-10-07 19:45:06 +0000 UTC" firstStartedPulling="2025-10-07 19:45:07.845880964 +0000 UTC m=+2696.667919601" lastFinishedPulling="2025-10-07 19:45:10.288459289 +0000 UTC m=+2699.110497936" observedRunningTime="2025-10-07 19:45:10.900051126 +0000 UTC m=+2699.722089793" watchObservedRunningTime="2025-10-07 19:45:10.927270321 +0000 UTC m=+2699.749308948" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.941288 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-842c5"] Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.942024 4825 scope.go:117] "RemoveContainer" 
containerID="c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.947048 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-842c5"] Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.984120 4825 scope.go:117] "RemoveContainer" containerID="0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514" Oct 07 19:45:10 crc kubenswrapper[4825]: E1007 19:45:10.984705 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514\": container with ID starting with 0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514 not found: ID does not exist" containerID="0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.984759 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514"} err="failed to get container status \"0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514\": rpc error: code = NotFound desc = could not find container \"0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514\": container with ID starting with 0a1daa073534d9cd0e86b34742e323063dbb6e85105fc719f09b770be19b7514 not found: ID does not exist" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.984789 4825 scope.go:117] "RemoveContainer" containerID="5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba" Oct 07 19:45:10 crc kubenswrapper[4825]: E1007 19:45:10.985355 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba\": container with ID starting with 
5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba not found: ID does not exist" containerID="5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.985422 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba"} err="failed to get container status \"5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba\": rpc error: code = NotFound desc = could not find container \"5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba\": container with ID starting with 5325fff2301f6fba62e217742a47e79ee076fb086bbb8372e9a15ad89684b0ba not found: ID does not exist" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.985447 4825 scope.go:117] "RemoveContainer" containerID="c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c" Oct 07 19:45:10 crc kubenswrapper[4825]: E1007 19:45:10.985822 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c\": container with ID starting with c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c not found: ID does not exist" containerID="c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c" Oct 07 19:45:10 crc kubenswrapper[4825]: I1007 19:45:10.985862 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c"} err="failed to get container status \"c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c\": rpc error: code = NotFound desc = could not find container \"c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c\": container with ID starting with c7718794702e3b62bcc053e625f81dbb1b2a02220d438714748aefda11b0c67c not found: ID does not 
exist" Oct 07 19:45:11 crc kubenswrapper[4825]: I1007 19:45:11.810595 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" path="/var/lib/kubelet/pods/b7de2029-f003-4278-9b43-c6a9b2246817/volumes" Oct 07 19:45:16 crc kubenswrapper[4825]: I1007 19:45:16.984439 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:16 crc kubenswrapper[4825]: I1007 19:45:16.984805 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:17 crc kubenswrapper[4825]: I1007 19:45:17.042842 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:18 crc kubenswrapper[4825]: I1007 19:45:18.031591 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:18 crc kubenswrapper[4825]: I1007 19:45:18.086073 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vr2tg"] Oct 07 19:45:18 crc kubenswrapper[4825]: I1007 19:45:18.988123 4825 generic.go:334] "Generic (PLEG): container finished" podID="803e0c1d-979b-47da-ba63-cad0323972a8" containerID="e5a75b5f16c4d9e584c2d6f7d2166fb9c4265bd997dd129d0343520e43f24779" exitCode=0 Oct 07 19:45:18 crc kubenswrapper[4825]: I1007 19:45:18.988173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" event={"ID":"803e0c1d-979b-47da-ba63-cad0323972a8","Type":"ContainerDied","Data":"e5a75b5f16c4d9e584c2d6f7d2166fb9c4265bd997dd129d0343520e43f24779"} Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.001377 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vr2tg" 
podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="registry-server" containerID="cri-o://abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730" gracePeriod=2 Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.530602 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.539453 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.689894 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-1\") pod \"803e0c1d-979b-47da-ba63-cad0323972a8\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690628 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnrgf\" (UniqueName: \"kubernetes.io/projected/803e0c1d-979b-47da-ba63-cad0323972a8-kube-api-access-wnrgf\") pod \"803e0c1d-979b-47da-ba63-cad0323972a8\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690674 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-0\") pod \"803e0c1d-979b-47da-ba63-cad0323972a8\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690696 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-2\") pod \"803e0c1d-979b-47da-ba63-cad0323972a8\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690727 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-catalog-content\") pod \"7e4a6aa1-3729-4f9a-b071-511d3246e908\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690773 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-utilities\") pod \"7e4a6aa1-3729-4f9a-b071-511d3246e908\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690827 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrzf4\" (UniqueName: \"kubernetes.io/projected/7e4a6aa1-3729-4f9a-b071-511d3246e908-kube-api-access-vrzf4\") pod \"7e4a6aa1-3729-4f9a-b071-511d3246e908\" (UID: \"7e4a6aa1-3729-4f9a-b071-511d3246e908\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690848 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ssh-key\") pod \"803e0c1d-979b-47da-ba63-cad0323972a8\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.690911 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-telemetry-combined-ca-bundle\") pod \"803e0c1d-979b-47da-ba63-cad0323972a8\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " Oct 07 19:45:20 crc 
kubenswrapper[4825]: I1007 19:45:20.690994 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-inventory\") pod \"803e0c1d-979b-47da-ba63-cad0323972a8\" (UID: \"803e0c1d-979b-47da-ba63-cad0323972a8\") " Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.691977 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-utilities" (OuterVolumeSpecName: "utilities") pod "7e4a6aa1-3729-4f9a-b071-511d3246e908" (UID: "7e4a6aa1-3729-4f9a-b071-511d3246e908"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.695306 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803e0c1d-979b-47da-ba63-cad0323972a8-kube-api-access-wnrgf" (OuterVolumeSpecName: "kube-api-access-wnrgf") pod "803e0c1d-979b-47da-ba63-cad0323972a8" (UID: "803e0c1d-979b-47da-ba63-cad0323972a8"). InnerVolumeSpecName "kube-api-access-wnrgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.696595 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "803e0c1d-979b-47da-ba63-cad0323972a8" (UID: "803e0c1d-979b-47da-ba63-cad0323972a8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.701722 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4a6aa1-3729-4f9a-b071-511d3246e908-kube-api-access-vrzf4" (OuterVolumeSpecName: "kube-api-access-vrzf4") pod "7e4a6aa1-3729-4f9a-b071-511d3246e908" (UID: "7e4a6aa1-3729-4f9a-b071-511d3246e908"). InnerVolumeSpecName "kube-api-access-vrzf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.722534 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-inventory" (OuterVolumeSpecName: "inventory") pod "803e0c1d-979b-47da-ba63-cad0323972a8" (UID: "803e0c1d-979b-47da-ba63-cad0323972a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.723518 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "803e0c1d-979b-47da-ba63-cad0323972a8" (UID: "803e0c1d-979b-47da-ba63-cad0323972a8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.726456 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "803e0c1d-979b-47da-ba63-cad0323972a8" (UID: "803e0c1d-979b-47da-ba63-cad0323972a8"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.726942 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "803e0c1d-979b-47da-ba63-cad0323972a8" (UID: "803e0c1d-979b-47da-ba63-cad0323972a8"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.728295 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "803e0c1d-979b-47da-ba63-cad0323972a8" (UID: "803e0c1d-979b-47da-ba63-cad0323972a8"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.743996 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e4a6aa1-3729-4f9a-b071-511d3246e908" (UID: "7e4a6aa1-3729-4f9a-b071-511d3246e908"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793538 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793572 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrzf4\" (UniqueName: \"kubernetes.io/projected/7e4a6aa1-3729-4f9a-b071-511d3246e908-kube-api-access-vrzf4\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793587 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793599 4825 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793611 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793623 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793635 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnrgf\" (UniqueName: \"kubernetes.io/projected/803e0c1d-979b-47da-ba63-cad0323972a8-kube-api-access-wnrgf\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc 
kubenswrapper[4825]: I1007 19:45:20.793649 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793663 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/803e0c1d-979b-47da-ba63-cad0323972a8-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:20 crc kubenswrapper[4825]: I1007 19:45:20.793675 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4a6aa1-3729-4f9a-b071-511d3246e908-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.012200 4825 generic.go:334] "Generic (PLEG): container finished" podID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerID="abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730" exitCode=0 Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.012291 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vr2tg" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.012280 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vr2tg" event={"ID":"7e4a6aa1-3729-4f9a-b071-511d3246e908","Type":"ContainerDied","Data":"abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730"} Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.012456 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vr2tg" event={"ID":"7e4a6aa1-3729-4f9a-b071-511d3246e908","Type":"ContainerDied","Data":"b8bddd3ec43c954b04009c0fce3b007c4e52cbe1c6a6b9f97f1e18b04dbb331b"} Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.012490 4825 scope.go:117] "RemoveContainer" containerID="abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.015637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" event={"ID":"803e0c1d-979b-47da-ba63-cad0323972a8","Type":"ContainerDied","Data":"c977d4eb04e171e2a4492c15b234909ca2aee25d7ad6f805e4792be20d7d00d7"} Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.015680 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c977d4eb04e171e2a4492c15b234909ca2aee25d7ad6f805e4792be20d7d00d7" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.015710 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.042781 4825 scope.go:117] "RemoveContainer" containerID="811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.079017 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vr2tg"] Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.087158 4825 scope.go:117] "RemoveContainer" containerID="26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.088014 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vr2tg"] Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.109322 4825 scope.go:117] "RemoveContainer" containerID="abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730" Oct 07 19:45:21 crc kubenswrapper[4825]: E1007 19:45:21.109683 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730\": container with ID starting with abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730 not found: ID does not exist" containerID="abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.109722 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730"} err="failed to get container status \"abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730\": rpc error: code = NotFound desc = could not find container \"abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730\": container with ID starting with 
abc2099340e2771641783ba5221b70efb3a8b1cf87049452780fdf87a921f730 not found: ID does not exist" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.109744 4825 scope.go:117] "RemoveContainer" containerID="811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54" Oct 07 19:45:21 crc kubenswrapper[4825]: E1007 19:45:21.110146 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54\": container with ID starting with 811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54 not found: ID does not exist" containerID="811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.110179 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54"} err="failed to get container status \"811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54\": rpc error: code = NotFound desc = could not find container \"811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54\": container with ID starting with 811f1cdb88c27a6b38d1b6886cc1b4ae652beb3f6b44c1c0c5618b40e61c6b54 not found: ID does not exist" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.110194 4825 scope.go:117] "RemoveContainer" containerID="26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7" Oct 07 19:45:21 crc kubenswrapper[4825]: E1007 19:45:21.110543 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7\": container with ID starting with 26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7 not found: ID does not exist" containerID="26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7" Oct 07 19:45:21 crc 
kubenswrapper[4825]: I1007 19:45:21.110572 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7"} err="failed to get container status \"26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7\": rpc error: code = NotFound desc = could not find container \"26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7\": container with ID starting with 26b28e4626a05e2f27f8cf6529557ff4f2ab5649117eb591568875c99c309dc7 not found: ID does not exist" Oct 07 19:45:21 crc kubenswrapper[4825]: I1007 19:45:21.814038 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" path="/var/lib/kubelet/pods/7e4a6aa1-3729-4f9a-b071-511d3246e908/volumes" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.131471 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qlhc5"] Oct 07 19:45:41 crc kubenswrapper[4825]: E1007 19:45:41.132987 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="registry-server" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133024 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="registry-server" Oct 07 19:45:41 crc kubenswrapper[4825]: E1007 19:45:41.133072 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="registry-server" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133092 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="registry-server" Oct 07 19:45:41 crc kubenswrapper[4825]: E1007 19:45:41.133124 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="extract-content" Oct 07 
19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133141 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="extract-content" Oct 07 19:45:41 crc kubenswrapper[4825]: E1007 19:45:41.133178 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803e0c1d-979b-47da-ba63-cad0323972a8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133202 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="803e0c1d-979b-47da-ba63-cad0323972a8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 19:45:41 crc kubenswrapper[4825]: E1007 19:45:41.133287 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="extract-utilities" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133308 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="extract-utilities" Oct 07 19:45:41 crc kubenswrapper[4825]: E1007 19:45:41.133332 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="extract-utilities" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133345 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="extract-utilities" Oct 07 19:45:41 crc kubenswrapper[4825]: E1007 19:45:41.133365 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="extract-content" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133377 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="extract-content" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133767 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7e4a6aa1-3729-4f9a-b071-511d3246e908" containerName="registry-server" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133806 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="803e0c1d-979b-47da-ba63-cad0323972a8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.133856 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7de2029-f003-4278-9b43-c6a9b2246817" containerName="registry-server" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.136850 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.145218 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qlhc5"] Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.195463 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-utilities\") pod \"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.195669 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fhxv\" (UniqueName: \"kubernetes.io/projected/40c6a7ea-d336-48ac-8029-674cf1c54b01-kube-api-access-9fhxv\") pod \"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.195710 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-catalog-content\") pod 
\"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.300711 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fhxv\" (UniqueName: \"kubernetes.io/projected/40c6a7ea-d336-48ac-8029-674cf1c54b01-kube-api-access-9fhxv\") pod \"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.301084 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-catalog-content\") pod \"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.301145 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-utilities\") pod \"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.301572 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-catalog-content\") pod \"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.301677 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-utilities\") pod \"community-operators-qlhc5\" (UID: 
\"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.328801 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fhxv\" (UniqueName: \"kubernetes.io/projected/40c6a7ea-d336-48ac-8029-674cf1c54b01-kube-api-access-9fhxv\") pod \"community-operators-qlhc5\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:41 crc kubenswrapper[4825]: I1007 19:45:41.473464 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:42 crc kubenswrapper[4825]: I1007 19:45:42.021878 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qlhc5"] Oct 07 19:45:42 crc kubenswrapper[4825]: I1007 19:45:42.233902 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlhc5" event={"ID":"40c6a7ea-d336-48ac-8029-674cf1c54b01","Type":"ContainerStarted","Data":"a2ad7e887748ae94124e8b509ac36ddaf4d84942909e24aba14fcfeba2ed0153"} Oct 07 19:45:43 crc kubenswrapper[4825]: I1007 19:45:43.256574 4825 generic.go:334] "Generic (PLEG): container finished" podID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerID="ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d" exitCode=0 Oct 07 19:45:43 crc kubenswrapper[4825]: I1007 19:45:43.256625 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlhc5" event={"ID":"40c6a7ea-d336-48ac-8029-674cf1c54b01","Type":"ContainerDied","Data":"ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d"} Oct 07 19:45:47 crc kubenswrapper[4825]: I1007 19:45:47.305075 4825 generic.go:334] "Generic (PLEG): container finished" podID="40c6a7ea-d336-48ac-8029-674cf1c54b01" 
containerID="cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2" exitCode=0 Oct 07 19:45:47 crc kubenswrapper[4825]: I1007 19:45:47.305144 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlhc5" event={"ID":"40c6a7ea-d336-48ac-8029-674cf1c54b01","Type":"ContainerDied","Data":"cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2"} Oct 07 19:45:49 crc kubenswrapper[4825]: I1007 19:45:49.437005 4825 scope.go:117] "RemoveContainer" containerID="113f6d9686877bcacbab22951fcbeec4eb56b70baa2a93dc1c54271a1ca7e34f" Oct 07 19:45:50 crc kubenswrapper[4825]: I1007 19:45:50.333713 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlhc5" event={"ID":"40c6a7ea-d336-48ac-8029-674cf1c54b01","Type":"ContainerStarted","Data":"9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95"} Oct 07 19:45:50 crc kubenswrapper[4825]: I1007 19:45:50.357398 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qlhc5" podStartSLOduration=3.05143558 podStartE2EDuration="9.357379072s" podCreationTimestamp="2025-10-07 19:45:41 +0000 UTC" firstStartedPulling="2025-10-07 19:45:43.261203591 +0000 UTC m=+2732.083242258" lastFinishedPulling="2025-10-07 19:45:49.567147073 +0000 UTC m=+2738.389185750" observedRunningTime="2025-10-07 19:45:50.353180039 +0000 UTC m=+2739.175218736" watchObservedRunningTime="2025-10-07 19:45:50.357379072 +0000 UTC m=+2739.179417709" Oct 07 19:45:51 crc kubenswrapper[4825]: I1007 19:45:51.474071 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:51 crc kubenswrapper[4825]: I1007 19:45:51.474483 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:45:52 crc kubenswrapper[4825]: I1007 19:45:52.529916 4825 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qlhc5" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="registry-server" probeResult="failure" output=< Oct 07 19:45:52 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Oct 07 19:45:52 crc kubenswrapper[4825]: > Oct 07 19:46:01 crc kubenswrapper[4825]: I1007 19:46:01.543856 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:46:01 crc kubenswrapper[4825]: I1007 19:46:01.601563 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:46:01 crc kubenswrapper[4825]: I1007 19:46:01.785107 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qlhc5"] Oct 07 19:46:03 crc kubenswrapper[4825]: I1007 19:46:03.462672 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qlhc5" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="registry-server" containerID="cri-o://9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95" gracePeriod=2 Oct 07 19:46:03 crc kubenswrapper[4825]: I1007 19:46:03.904440 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:46:03 crc kubenswrapper[4825]: I1007 19:46:03.992206 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fhxv\" (UniqueName: \"kubernetes.io/projected/40c6a7ea-d336-48ac-8029-674cf1c54b01-kube-api-access-9fhxv\") pod \"40c6a7ea-d336-48ac-8029-674cf1c54b01\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " Oct 07 19:46:03 crc kubenswrapper[4825]: I1007 19:46:03.992524 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-utilities\") pod \"40c6a7ea-d336-48ac-8029-674cf1c54b01\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " Oct 07 19:46:03 crc kubenswrapper[4825]: I1007 19:46:03.992645 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-catalog-content\") pod \"40c6a7ea-d336-48ac-8029-674cf1c54b01\" (UID: \"40c6a7ea-d336-48ac-8029-674cf1c54b01\") " Oct 07 19:46:03 crc kubenswrapper[4825]: I1007 19:46:03.993857 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-utilities" (OuterVolumeSpecName: "utilities") pod "40c6a7ea-d336-48ac-8029-674cf1c54b01" (UID: "40c6a7ea-d336-48ac-8029-674cf1c54b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:46:03 crc kubenswrapper[4825]: I1007 19:46:03.997907 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c6a7ea-d336-48ac-8029-674cf1c54b01-kube-api-access-9fhxv" (OuterVolumeSpecName: "kube-api-access-9fhxv") pod "40c6a7ea-d336-48ac-8029-674cf1c54b01" (UID: "40c6a7ea-d336-48ac-8029-674cf1c54b01"). InnerVolumeSpecName "kube-api-access-9fhxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.057276 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40c6a7ea-d336-48ac-8029-674cf1c54b01" (UID: "40c6a7ea-d336-48ac-8029-674cf1c54b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.095163 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fhxv\" (UniqueName: \"kubernetes.io/projected/40c6a7ea-d336-48ac-8029-674cf1c54b01-kube-api-access-9fhxv\") on node \"crc\" DevicePath \"\"" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.095508 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.095519 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c6a7ea-d336-48ac-8029-674cf1c54b01-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.478743 4825 generic.go:334] "Generic (PLEG): container finished" podID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerID="9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95" exitCode=0 Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.479496 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qlhc5" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.479568 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlhc5" event={"ID":"40c6a7ea-d336-48ac-8029-674cf1c54b01","Type":"ContainerDied","Data":"9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95"} Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.479997 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlhc5" event={"ID":"40c6a7ea-d336-48ac-8029-674cf1c54b01","Type":"ContainerDied","Data":"a2ad7e887748ae94124e8b509ac36ddaf4d84942909e24aba14fcfeba2ed0153"} Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.480081 4825 scope.go:117] "RemoveContainer" containerID="9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.530917 4825 scope.go:117] "RemoveContainer" containerID="cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.535279 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qlhc5"] Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.543527 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qlhc5"] Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.565332 4825 scope.go:117] "RemoveContainer" containerID="ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.611551 4825 scope.go:117] "RemoveContainer" containerID="9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95" Oct 07 19:46:04 crc kubenswrapper[4825]: E1007 19:46:04.612097 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95\": container with ID starting with 9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95 not found: ID does not exist" containerID="9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.612148 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95"} err="failed to get container status \"9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95\": rpc error: code = NotFound desc = could not find container \"9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95\": container with ID starting with 9f6a7824480491d63eeb31dadb0586d47bc14f38493c0f47ef2d57a8792f3b95 not found: ID does not exist" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.612179 4825 scope.go:117] "RemoveContainer" containerID="cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2" Oct 07 19:46:04 crc kubenswrapper[4825]: E1007 19:46:04.612531 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2\": container with ID starting with cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2 not found: ID does not exist" containerID="cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.612628 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2"} err="failed to get container status \"cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2\": rpc error: code = NotFound desc = could not find container \"cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2\": container with ID 
starting with cab9d09d438ef8485d0b8afef41416c287d3bff8fb458fe98bd4a8db16ee31c2 not found: ID does not exist" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.612717 4825 scope.go:117] "RemoveContainer" containerID="ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d" Oct 07 19:46:04 crc kubenswrapper[4825]: E1007 19:46:04.613088 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d\": container with ID starting with ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d not found: ID does not exist" containerID="ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d" Oct 07 19:46:04 crc kubenswrapper[4825]: I1007 19:46:04.613177 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d"} err="failed to get container status \"ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d\": rpc error: code = NotFound desc = could not find container \"ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d\": container with ID starting with ec4aa35f259bde980af05091d37f1ee8684a069ada35ffc419713f6b4263886d not found: ID does not exist" Oct 07 19:46:05 crc kubenswrapper[4825]: I1007 19:46:05.810075 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" path="/var/lib/kubelet/pods/40c6a7ea-d336-48ac-8029-674cf1c54b01/volumes" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.299348 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 19:46:09 crc kubenswrapper[4825]: E1007 19:46:09.300327 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="extract-utilities" Oct 07 19:46:09 crc kubenswrapper[4825]: 
I1007 19:46:09.300351 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="extract-utilities" Oct 07 19:46:09 crc kubenswrapper[4825]: E1007 19:46:09.300373 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="extract-content" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.300385 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="extract-content" Oct 07 19:46:09 crc kubenswrapper[4825]: E1007 19:46:09.300437 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="registry-server" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.300450 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="registry-server" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.300835 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c6a7ea-d336-48ac-8029-674cf1c54b01" containerName="registry-server" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.301924 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.306930 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.307288 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.307907 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ft7fd" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.314606 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.320079 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.403552 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjbt4\" (UniqueName: \"kubernetes.io/projected/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-kube-api-access-cjbt4\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.403629 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.403859 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.404038 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.404087 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.404125 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.404494 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-config-data\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.404614 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-temporary\") 
pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.404692 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.506686 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-config-data\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.506917 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.507011 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.507410 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjbt4\" (UniqueName: \"kubernetes.io/projected/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-kube-api-access-cjbt4\") pod \"tempest-tests-tempest\" 
(UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.507568 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.507694 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.507758 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.507866 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.507983 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 
19:46:09.508070 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.508377 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.508667 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.510175 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.510877 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-config-data\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.514842 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.516987 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.522797 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.545016 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjbt4\" (UniqueName: \"kubernetes.io/projected/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-kube-api-access-cjbt4\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.566287 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " pod="openstack/tempest-tests-tempest" Oct 07 19:46:09 crc kubenswrapper[4825]: I1007 19:46:09.644213 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 19:46:10 crc kubenswrapper[4825]: I1007 19:46:10.144031 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 19:46:10 crc kubenswrapper[4825]: I1007 19:46:10.551639 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"cf9823f2-5baf-49ba-9da5-a8f13ac66d75","Type":"ContainerStarted","Data":"83cd530afbd6ffc6a0ed355381c2097222402612c99714a0b2b2ac35d2c6e661"} Oct 07 19:46:37 crc kubenswrapper[4825]: E1007 19:46:37.184089 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 07 19:46:37 crc kubenswrapper[4825]: E1007 19:46:37.184941 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathE
xpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjbt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(cf9823f2-5baf-49ba-9da5-a8f13ac66d75): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 19:46:37 crc kubenswrapper[4825]: E1007 19:46:37.186705 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="cf9823f2-5baf-49ba-9da5-a8f13ac66d75" Oct 07 19:46:37 crc kubenswrapper[4825]: E1007 19:46:37.852774 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="cf9823f2-5baf-49ba-9da5-a8f13ac66d75" Oct 07 19:46:52 crc kubenswrapper[4825]: I1007 19:46:52.371106 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 07 19:46:54 crc kubenswrapper[4825]: I1007 19:46:54.059739 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"cf9823f2-5baf-49ba-9da5-a8f13ac66d75","Type":"ContainerStarted","Data":"d826273016649b3791f3a8a74afbcf053c3fb9bad627c31f8f3250eed148706e"} Oct 07 19:46:54 crc kubenswrapper[4825]: I1007 19:46:54.091526 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.873150893 podStartE2EDuration="46.09150176s" podCreationTimestamp="2025-10-07 19:46:08 +0000 UTC" firstStartedPulling="2025-10-07 19:46:10.149494568 +0000 UTC m=+2758.971533235" lastFinishedPulling="2025-10-07 19:46:52.367845465 +0000 UTC m=+2801.189884102" observedRunningTime="2025-10-07 19:46:54.084103304 +0000 UTC m=+2802.906142001" 
watchObservedRunningTime="2025-10-07 19:46:54.09150176 +0000 UTC m=+2802.913540437" Oct 07 19:47:05 crc kubenswrapper[4825]: I1007 19:47:05.708772 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:47:05 crc kubenswrapper[4825]: I1007 19:47:05.709550 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:47:35 crc kubenswrapper[4825]: I1007 19:47:35.709402 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:47:35 crc kubenswrapper[4825]: I1007 19:47:35.710331 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 19:48:05.709167 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 19:48:05.709792 4825 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 19:48:05.709844 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 19:48:05.710678 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57b89716c03c7599611d4dae6b8b92e7c7cf3c08e25e88a735fdb4005f3714e3"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 19:48:05.710737 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://57b89716c03c7599611d4dae6b8b92e7c7cf3c08e25e88a735fdb4005f3714e3" gracePeriod=600 Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 19:48:05.927109 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="57b89716c03c7599611d4dae6b8b92e7c7cf3c08e25e88a735fdb4005f3714e3" exitCode=0 Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 19:48:05.927155 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"57b89716c03c7599611d4dae6b8b92e7c7cf3c08e25e88a735fdb4005f3714e3"} Oct 07 19:48:05 crc kubenswrapper[4825]: I1007 
19:48:05.927206 4825 scope.go:117] "RemoveContainer" containerID="4060db13b990db1850a2e490958d643862ec53a5d697dea08ecddea7dc31471d" Oct 07 19:48:06 crc kubenswrapper[4825]: I1007 19:48:06.943129 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94"} Oct 07 19:50:35 crc kubenswrapper[4825]: I1007 19:50:35.708848 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:50:35 crc kubenswrapper[4825]: I1007 19:50:35.709465 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:51:05 crc kubenswrapper[4825]: I1007 19:51:05.709146 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:51:05 crc kubenswrapper[4825]: I1007 19:51:05.709746 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:51:35 crc 
kubenswrapper[4825]: I1007 19:51:35.708800 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:51:35 crc kubenswrapper[4825]: I1007 19:51:35.709387 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:51:35 crc kubenswrapper[4825]: I1007 19:51:35.709442 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 19:51:35 crc kubenswrapper[4825]: I1007 19:51:35.710285 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 19:51:35 crc kubenswrapper[4825]: I1007 19:51:35.710349 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" gracePeriod=600 Oct 07 19:51:35 crc kubenswrapper[4825]: E1007 19:51:35.840959 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:51:36 crc kubenswrapper[4825]: I1007 19:51:36.176882 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" exitCode=0 Oct 07 19:51:36 crc kubenswrapper[4825]: I1007 19:51:36.176981 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94"} Oct 07 19:51:36 crc kubenswrapper[4825]: I1007 19:51:36.177472 4825 scope.go:117] "RemoveContainer" containerID="57b89716c03c7599611d4dae6b8b92e7c7cf3c08e25e88a735fdb4005f3714e3" Oct 07 19:51:36 crc kubenswrapper[4825]: I1007 19:51:36.178471 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:51:36 crc kubenswrapper[4825]: E1007 19:51:36.179009 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:51:51 crc kubenswrapper[4825]: I1007 19:51:51.809085 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:51:51 crc kubenswrapper[4825]: E1007 19:51:51.810018 4825 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:52:05 crc kubenswrapper[4825]: I1007 19:52:05.795841 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:52:05 crc kubenswrapper[4825]: E1007 19:52:05.796886 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:52:19 crc kubenswrapper[4825]: I1007 19:52:19.795957 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:52:19 crc kubenswrapper[4825]: E1007 19:52:19.797540 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:52:30 crc kubenswrapper[4825]: I1007 19:52:30.796524 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:52:30 crc kubenswrapper[4825]: E1007 19:52:30.797780 4825 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:52:42 crc kubenswrapper[4825]: I1007 19:52:42.795565 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:52:42 crc kubenswrapper[4825]: E1007 19:52:42.796260 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.544272 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9925x"] Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.546512 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.599082 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9925x"] Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.638526 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-utilities\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.638569 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-catalog-content\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.638694 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrltp\" (UniqueName: \"kubernetes.io/projected/a0f365b9-34c4-46d3-b795-ae65f4374362-kube-api-access-zrltp\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.739941 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-utilities\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.739988 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-catalog-content\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.740129 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrltp\" (UniqueName: \"kubernetes.io/projected/a0f365b9-34c4-46d3-b795-ae65f4374362-kube-api-access-zrltp\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.740634 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-utilities\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.740940 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-catalog-content\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.765005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrltp\" (UniqueName: \"kubernetes.io/projected/a0f365b9-34c4-46d3-b795-ae65f4374362-kube-api-access-zrltp\") pod \"redhat-operators-9925x\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.795105 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:52:53 crc 
kubenswrapper[4825]: E1007 19:52:53.795355 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:52:53 crc kubenswrapper[4825]: I1007 19:52:53.906903 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:52:54 crc kubenswrapper[4825]: I1007 19:52:54.401686 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9925x"] Oct 07 19:52:55 crc kubenswrapper[4825]: I1007 19:52:55.070553 4825 generic.go:334] "Generic (PLEG): container finished" podID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerID="3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5" exitCode=0 Oct 07 19:52:55 crc kubenswrapper[4825]: I1007 19:52:55.070646 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9925x" event={"ID":"a0f365b9-34c4-46d3-b795-ae65f4374362","Type":"ContainerDied","Data":"3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5"} Oct 07 19:52:55 crc kubenswrapper[4825]: I1007 19:52:55.070857 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9925x" event={"ID":"a0f365b9-34c4-46d3-b795-ae65f4374362","Type":"ContainerStarted","Data":"1c5067f86a18c742879539659eab2e02dcb3398d4babccc7e8d3bfbecf92f3dd"} Oct 07 19:52:55 crc kubenswrapper[4825]: I1007 19:52:55.072733 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:52:57 crc kubenswrapper[4825]: I1007 19:52:57.089317 4825 generic.go:334] "Generic (PLEG): 
container finished" podID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerID="1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83" exitCode=0 Oct 07 19:52:57 crc kubenswrapper[4825]: I1007 19:52:57.089387 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9925x" event={"ID":"a0f365b9-34c4-46d3-b795-ae65f4374362","Type":"ContainerDied","Data":"1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83"} Oct 07 19:52:59 crc kubenswrapper[4825]: I1007 19:52:59.116824 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9925x" event={"ID":"a0f365b9-34c4-46d3-b795-ae65f4374362","Type":"ContainerStarted","Data":"5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9"} Oct 07 19:53:03 crc kubenswrapper[4825]: I1007 19:53:03.907053 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:53:03 crc kubenswrapper[4825]: I1007 19:53:03.907661 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:53:04 crc kubenswrapper[4825]: I1007 19:53:04.957021 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9925x" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="registry-server" probeResult="failure" output=< Oct 07 19:53:04 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Oct 07 19:53:04 crc kubenswrapper[4825]: > Oct 07 19:53:06 crc kubenswrapper[4825]: I1007 19:53:06.795687 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:53:06 crc kubenswrapper[4825]: E1007 19:53:06.796187 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:53:14 crc kubenswrapper[4825]: I1007 19:53:14.000986 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:53:14 crc kubenswrapper[4825]: I1007 19:53:14.027728 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9925x" podStartSLOduration=18.194664754 podStartE2EDuration="21.027682972s" podCreationTimestamp="2025-10-07 19:52:53 +0000 UTC" firstStartedPulling="2025-10-07 19:52:55.072514502 +0000 UTC m=+3163.894553139" lastFinishedPulling="2025-10-07 19:52:57.90553271 +0000 UTC m=+3166.727571357" observedRunningTime="2025-10-07 19:52:59.149525014 +0000 UTC m=+3167.971563671" watchObservedRunningTime="2025-10-07 19:53:14.027682972 +0000 UTC m=+3182.849721620" Oct 07 19:53:14 crc kubenswrapper[4825]: I1007 19:53:14.076429 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:53:14 crc kubenswrapper[4825]: I1007 19:53:14.260425 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9925x"] Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.287814 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9925x" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="registry-server" containerID="cri-o://5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9" gracePeriod=2 Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.830750 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.869532 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-catalog-content\") pod \"a0f365b9-34c4-46d3-b795-ae65f4374362\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.869952 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-utilities\") pod \"a0f365b9-34c4-46d3-b795-ae65f4374362\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.870052 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrltp\" (UniqueName: \"kubernetes.io/projected/a0f365b9-34c4-46d3-b795-ae65f4374362-kube-api-access-zrltp\") pod \"a0f365b9-34c4-46d3-b795-ae65f4374362\" (UID: \"a0f365b9-34c4-46d3-b795-ae65f4374362\") " Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.871367 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-utilities" (OuterVolumeSpecName: "utilities") pod "a0f365b9-34c4-46d3-b795-ae65f4374362" (UID: "a0f365b9-34c4-46d3-b795-ae65f4374362"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.880163 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f365b9-34c4-46d3-b795-ae65f4374362-kube-api-access-zrltp" (OuterVolumeSpecName: "kube-api-access-zrltp") pod "a0f365b9-34c4-46d3-b795-ae65f4374362" (UID: "a0f365b9-34c4-46d3-b795-ae65f4374362"). InnerVolumeSpecName "kube-api-access-zrltp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.964895 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0f365b9-34c4-46d3-b795-ae65f4374362" (UID: "a0f365b9-34c4-46d3-b795-ae65f4374362"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.972497 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.972531 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f365b9-34c4-46d3-b795-ae65f4374362-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:53:15 crc kubenswrapper[4825]: I1007 19:53:15.972543 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrltp\" (UniqueName: \"kubernetes.io/projected/a0f365b9-34c4-46d3-b795-ae65f4374362-kube-api-access-zrltp\") on node \"crc\" DevicePath \"\"" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.300885 4825 generic.go:334] "Generic (PLEG): container finished" podID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerID="5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9" exitCode=0 Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.300942 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9925x" event={"ID":"a0f365b9-34c4-46d3-b795-ae65f4374362","Type":"ContainerDied","Data":"5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9"} Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.300958 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9925x" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.300984 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9925x" event={"ID":"a0f365b9-34c4-46d3-b795-ae65f4374362","Type":"ContainerDied","Data":"1c5067f86a18c742879539659eab2e02dcb3398d4babccc7e8d3bfbecf92f3dd"} Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.301012 4825 scope.go:117] "RemoveContainer" containerID="5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.347519 4825 scope.go:117] "RemoveContainer" containerID="1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.350719 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9925x"] Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.362240 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9925x"] Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.388111 4825 scope.go:117] "RemoveContainer" containerID="3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.424564 4825 scope.go:117] "RemoveContainer" containerID="5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9" Oct 07 19:53:16 crc kubenswrapper[4825]: E1007 19:53:16.425121 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9\": container with ID starting with 5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9 not found: ID does not exist" containerID="5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.425167 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9"} err="failed to get container status \"5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9\": rpc error: code = NotFound desc = could not find container \"5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9\": container with ID starting with 5f96e0c65d76c79faacaf1234bb100515de6684e07bd341baa11ed93c5441fc9 not found: ID does not exist" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.425199 4825 scope.go:117] "RemoveContainer" containerID="1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83" Oct 07 19:53:16 crc kubenswrapper[4825]: E1007 19:53:16.425750 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83\": container with ID starting with 1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83 not found: ID does not exist" containerID="1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.425791 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83"} err="failed to get container status \"1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83\": rpc error: code = NotFound desc = could not find container \"1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83\": container with ID starting with 1ce460bc2d2e725c8af70c09fe058aff1b56fcc3bd774b4a283aa5010bb33c83 not found: ID does not exist" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.425821 4825 scope.go:117] "RemoveContainer" containerID="3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5" Oct 07 19:53:16 crc kubenswrapper[4825]: E1007 
19:53:16.426193 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5\": container with ID starting with 3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5 not found: ID does not exist" containerID="3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5" Oct 07 19:53:16 crc kubenswrapper[4825]: I1007 19:53:16.426213 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5"} err="failed to get container status \"3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5\": rpc error: code = NotFound desc = could not find container \"3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5\": container with ID starting with 3c36d46300a562fca60fdd9244bf4bd16bebd39651a35dee632170bbb244b7e5 not found: ID does not exist" Oct 07 19:53:17 crc kubenswrapper[4825]: I1007 19:53:17.796195 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:53:17 crc kubenswrapper[4825]: E1007 19:53:17.797130 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:53:17 crc kubenswrapper[4825]: I1007 19:53:17.814803 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" path="/var/lib/kubelet/pods/a0f365b9-34c4-46d3-b795-ae65f4374362/volumes" Oct 07 19:53:28 crc kubenswrapper[4825]: I1007 19:53:28.795972 
4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:53:28 crc kubenswrapper[4825]: E1007 19:53:28.797413 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:53:40 crc kubenswrapper[4825]: I1007 19:53:40.795372 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:53:40 crc kubenswrapper[4825]: E1007 19:53:40.796319 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:53:51 crc kubenswrapper[4825]: I1007 19:53:51.809990 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:53:51 crc kubenswrapper[4825]: E1007 19:53:51.811476 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:54:05 crc kubenswrapper[4825]: I1007 
19:54:05.796874 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:54:05 crc kubenswrapper[4825]: E1007 19:54:05.797863 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:54:16 crc kubenswrapper[4825]: I1007 19:54:16.799031 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:54:16 crc kubenswrapper[4825]: E1007 19:54:16.800678 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:54:30 crc kubenswrapper[4825]: I1007 19:54:30.795645 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:54:30 crc kubenswrapper[4825]: E1007 19:54:30.796439 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:54:42 crc 
kubenswrapper[4825]: I1007 19:54:42.795389 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:54:42 crc kubenswrapper[4825]: E1007 19:54:42.796115 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:54:57 crc kubenswrapper[4825]: I1007 19:54:57.796145 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:54:57 crc kubenswrapper[4825]: E1007 19:54:57.797214 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.625273 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zg42k"] Oct 07 19:55:10 crc kubenswrapper[4825]: E1007 19:55:10.626140 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="extract-utilities" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.626154 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="extract-utilities" Oct 07 19:55:10 crc kubenswrapper[4825]: E1007 19:55:10.626175 4825 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="registry-server" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.626182 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="registry-server" Oct 07 19:55:10 crc kubenswrapper[4825]: E1007 19:55:10.626204 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="extract-content" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.626211 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="extract-content" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.626426 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f365b9-34c4-46d3-b795-ae65f4374362" containerName="registry-server" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.627692 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.636348 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zg42k"] Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.766469 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-catalog-content\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.766563 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-utilities\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.766610 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7p8\" (UniqueName: \"kubernetes.io/projected/9a67fe05-3977-4f62-a914-bd41a6c59408-kube-api-access-kr7p8\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.794909 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:55:10 crc kubenswrapper[4825]: E1007 19:55:10.795118 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.868686 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-catalog-content\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.868804 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-utilities\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.868857 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7p8\" (UniqueName: \"kubernetes.io/projected/9a67fe05-3977-4f62-a914-bd41a6c59408-kube-api-access-kr7p8\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.869862 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-catalog-content\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.869893 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-utilities\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.894358 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr7p8\" (UniqueName: \"kubernetes.io/projected/9a67fe05-3977-4f62-a914-bd41a6c59408-kube-api-access-kr7p8\") pod \"certified-operators-zg42k\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:10 crc kubenswrapper[4825]: I1007 19:55:10.973494 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:11 crc kubenswrapper[4825]: I1007 19:55:11.522846 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zg42k"] Oct 07 19:55:11 crc kubenswrapper[4825]: I1007 19:55:11.571802 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg42k" event={"ID":"9a67fe05-3977-4f62-a914-bd41a6c59408","Type":"ContainerStarted","Data":"8e8e664c1a6840e7c8f7054bfe66a148f4d979529d931bc33e43265f64350b9d"} Oct 07 19:55:12 crc kubenswrapper[4825]: I1007 19:55:12.588445 4825 generic.go:334] "Generic (PLEG): container finished" podID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerID="7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9" exitCode=0 Oct 07 19:55:12 crc kubenswrapper[4825]: I1007 19:55:12.588687 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg42k" event={"ID":"9a67fe05-3977-4f62-a914-bd41a6c59408","Type":"ContainerDied","Data":"7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9"} Oct 07 19:55:14 crc kubenswrapper[4825]: I1007 19:55:14.611433 4825 generic.go:334] "Generic (PLEG): container 
finished" podID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerID="869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6" exitCode=0 Oct 07 19:55:14 crc kubenswrapper[4825]: I1007 19:55:14.611487 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg42k" event={"ID":"9a67fe05-3977-4f62-a914-bd41a6c59408","Type":"ContainerDied","Data":"869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6"} Oct 07 19:55:15 crc kubenswrapper[4825]: I1007 19:55:15.621295 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg42k" event={"ID":"9a67fe05-3977-4f62-a914-bd41a6c59408","Type":"ContainerStarted","Data":"04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644"} Oct 07 19:55:15 crc kubenswrapper[4825]: I1007 19:55:15.637844 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zg42k" podStartSLOduration=3.154969431 podStartE2EDuration="5.637821601s" podCreationTimestamp="2025-10-07 19:55:10 +0000 UTC" firstStartedPulling="2025-10-07 19:55:12.591573861 +0000 UTC m=+3301.413612518" lastFinishedPulling="2025-10-07 19:55:15.074426041 +0000 UTC m=+3303.896464688" observedRunningTime="2025-10-07 19:55:15.634991721 +0000 UTC m=+3304.457030358" watchObservedRunningTime="2025-10-07 19:55:15.637821601 +0000 UTC m=+3304.459860238" Oct 07 19:55:20 crc kubenswrapper[4825]: I1007 19:55:20.974904 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:20 crc kubenswrapper[4825]: I1007 19:55:20.975402 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:21 crc kubenswrapper[4825]: I1007 19:55:21.022691 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zg42k" Oct 
07 19:55:21 crc kubenswrapper[4825]: I1007 19:55:21.763135 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:21 crc kubenswrapper[4825]: I1007 19:55:21.842091 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zg42k"] Oct 07 19:55:23 crc kubenswrapper[4825]: I1007 19:55:23.698587 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zg42k" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="registry-server" containerID="cri-o://04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644" gracePeriod=2 Oct 07 19:55:23 crc kubenswrapper[4825]: I1007 19:55:23.795512 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:55:23 crc kubenswrapper[4825]: E1007 19:55:23.796077 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.274293 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.340684 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr7p8\" (UniqueName: \"kubernetes.io/projected/9a67fe05-3977-4f62-a914-bd41a6c59408-kube-api-access-kr7p8\") pod \"9a67fe05-3977-4f62-a914-bd41a6c59408\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.341614 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-utilities\") pod \"9a67fe05-3977-4f62-a914-bd41a6c59408\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.341867 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-catalog-content\") pod \"9a67fe05-3977-4f62-a914-bd41a6c59408\" (UID: \"9a67fe05-3977-4f62-a914-bd41a6c59408\") " Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.343547 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-utilities" (OuterVolumeSpecName: "utilities") pod "9a67fe05-3977-4f62-a914-bd41a6c59408" (UID: "9a67fe05-3977-4f62-a914-bd41a6c59408"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.347876 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a67fe05-3977-4f62-a914-bd41a6c59408-kube-api-access-kr7p8" (OuterVolumeSpecName: "kube-api-access-kr7p8") pod "9a67fe05-3977-4f62-a914-bd41a6c59408" (UID: "9a67fe05-3977-4f62-a914-bd41a6c59408"). InnerVolumeSpecName "kube-api-access-kr7p8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.443335 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.443371 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr7p8\" (UniqueName: \"kubernetes.io/projected/9a67fe05-3977-4f62-a914-bd41a6c59408-kube-api-access-kr7p8\") on node \"crc\" DevicePath \"\"" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.507802 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a67fe05-3977-4f62-a914-bd41a6c59408" (UID: "9a67fe05-3977-4f62-a914-bd41a6c59408"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.546214 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67fe05-3977-4f62-a914-bd41a6c59408-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.720433 4825 generic.go:334] "Generic (PLEG): container finished" podID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerID="04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644" exitCode=0 Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.720479 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg42k" event={"ID":"9a67fe05-3977-4f62-a914-bd41a6c59408","Type":"ContainerDied","Data":"04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644"} Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.720553 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zg42k" event={"ID":"9a67fe05-3977-4f62-a914-bd41a6c59408","Type":"ContainerDied","Data":"8e8e664c1a6840e7c8f7054bfe66a148f4d979529d931bc33e43265f64350b9d"} Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.720629 4825 scope.go:117] "RemoveContainer" containerID="04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.722767 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zg42k" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.746521 4825 scope.go:117] "RemoveContainer" containerID="869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.762610 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zg42k"] Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.771277 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zg42k"] Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.792207 4825 scope.go:117] "RemoveContainer" containerID="7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.839939 4825 scope.go:117] "RemoveContainer" containerID="04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644" Oct 07 19:55:24 crc kubenswrapper[4825]: E1007 19:55:24.840430 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644\": container with ID starting with 04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644 not found: ID does not exist" containerID="04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 
19:55:24.840463 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644"} err="failed to get container status \"04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644\": rpc error: code = NotFound desc = could not find container \"04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644\": container with ID starting with 04ebc56c10f3590a155069d6585650b4a709329b40cc7df38fff31cda98d0644 not found: ID does not exist" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.840485 4825 scope.go:117] "RemoveContainer" containerID="869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6" Oct 07 19:55:24 crc kubenswrapper[4825]: E1007 19:55:24.840811 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6\": container with ID starting with 869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6 not found: ID does not exist" containerID="869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.840948 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6"} err="failed to get container status \"869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6\": rpc error: code = NotFound desc = could not find container \"869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6\": container with ID starting with 869b88e824ba8f33e97625ba803574509353955cc81ea7f20c1694a15b8787b6 not found: ID does not exist" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.841061 4825 scope.go:117] "RemoveContainer" containerID="7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9" Oct 07 19:55:24 crc 
kubenswrapper[4825]: E1007 19:55:24.841460 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9\": container with ID starting with 7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9 not found: ID does not exist" containerID="7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9" Oct 07 19:55:24 crc kubenswrapper[4825]: I1007 19:55:24.841485 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9"} err="failed to get container status \"7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9\": rpc error: code = NotFound desc = could not find container \"7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9\": container with ID starting with 7fcbabef25309613fc13e2a4386305458305fa057fab5403a0a1e487bc45dbd9 not found: ID does not exist" Oct 07 19:55:25 crc kubenswrapper[4825]: I1007 19:55:25.810530 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" path="/var/lib/kubelet/pods/9a67fe05-3977-4f62-a914-bd41a6c59408/volumes" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.758767 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-45p9b"] Oct 07 19:55:36 crc kubenswrapper[4825]: E1007 19:55:36.760262 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="extract-utilities" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.760291 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="extract-utilities" Oct 07 19:55:36 crc kubenswrapper[4825]: E1007 19:55:36.760325 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="registry-server" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.760341 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="registry-server" Oct 07 19:55:36 crc kubenswrapper[4825]: E1007 19:55:36.760368 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="extract-content" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.760380 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="extract-content" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.760798 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a67fe05-3977-4f62-a914-bd41a6c59408" containerName="registry-server" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.767807 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.787089 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-45p9b"] Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.808458 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-catalog-content\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.808622 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-utilities\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " 
pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.808794 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22xc7\" (UniqueName: \"kubernetes.io/projected/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-kube-api-access-22xc7\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.910897 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-catalog-content\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.910991 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-utilities\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.911148 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22xc7\" (UniqueName: \"kubernetes.io/projected/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-kube-api-access-22xc7\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.911966 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-utilities\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " 
pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.912074 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-catalog-content\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:36 crc kubenswrapper[4825]: I1007 19:55:36.941779 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22xc7\" (UniqueName: \"kubernetes.io/projected/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-kube-api-access-22xc7\") pod \"redhat-marketplace-45p9b\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:37 crc kubenswrapper[4825]: I1007 19:55:37.100420 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:37 crc kubenswrapper[4825]: I1007 19:55:37.615160 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-45p9b"] Oct 07 19:55:37 crc kubenswrapper[4825]: I1007 19:55:37.795833 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:55:37 crc kubenswrapper[4825]: E1007 19:55:37.796491 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:55:37 crc kubenswrapper[4825]: I1007 19:55:37.864886 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerID="aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40" exitCode=0 Oct 07 19:55:37 crc kubenswrapper[4825]: I1007 19:55:37.864954 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45p9b" event={"ID":"a5e76f29-ba71-41c1-9be7-9a95f188d0a6","Type":"ContainerDied","Data":"aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40"} Oct 07 19:55:37 crc kubenswrapper[4825]: I1007 19:55:37.864998 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45p9b" event={"ID":"a5e76f29-ba71-41c1-9be7-9a95f188d0a6","Type":"ContainerStarted","Data":"512c268c8171d7c516b5bfdc13e7f4f2c50e0ae206facf8036dbc461c798a50f"} Oct 07 19:55:38 crc kubenswrapper[4825]: I1007 19:55:38.883945 4825 generic.go:334] "Generic (PLEG): container finished" podID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerID="3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd" exitCode=0 Oct 07 19:55:38 crc kubenswrapper[4825]: I1007 19:55:38.884512 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45p9b" event={"ID":"a5e76f29-ba71-41c1-9be7-9a95f188d0a6","Type":"ContainerDied","Data":"3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd"} Oct 07 19:55:39 crc kubenswrapper[4825]: I1007 19:55:39.902620 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45p9b" event={"ID":"a5e76f29-ba71-41c1-9be7-9a95f188d0a6","Type":"ContainerStarted","Data":"5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa"} Oct 07 19:55:39 crc kubenswrapper[4825]: I1007 19:55:39.924613 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-45p9b" podStartSLOduration=2.351001591 podStartE2EDuration="3.924596188s" podCreationTimestamp="2025-10-07 19:55:36 +0000 UTC" 
firstStartedPulling="2025-10-07 19:55:37.867540908 +0000 UTC m=+3326.689579545" lastFinishedPulling="2025-10-07 19:55:39.441135505 +0000 UTC m=+3328.263174142" observedRunningTime="2025-10-07 19:55:39.922794552 +0000 UTC m=+3328.744833189" watchObservedRunningTime="2025-10-07 19:55:39.924596188 +0000 UTC m=+3328.746634835" Oct 07 19:55:47 crc kubenswrapper[4825]: I1007 19:55:47.101299 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:47 crc kubenswrapper[4825]: I1007 19:55:47.101878 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:47 crc kubenswrapper[4825]: I1007 19:55:47.153211 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:48 crc kubenswrapper[4825]: I1007 19:55:48.095697 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:48 crc kubenswrapper[4825]: I1007 19:55:48.161724 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-45p9b"] Oct 07 19:55:49 crc kubenswrapper[4825]: I1007 19:55:49.795086 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:55:49 crc kubenswrapper[4825]: E1007 19:55:49.795800 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.041748 4825 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-45p9b" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="registry-server" containerID="cri-o://5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa" gracePeriod=2 Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.592374 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.694892 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-utilities\") pod \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.694961 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22xc7\" (UniqueName: \"kubernetes.io/projected/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-kube-api-access-22xc7\") pod \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.695147 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-catalog-content\") pod \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\" (UID: \"a5e76f29-ba71-41c1-9be7-9a95f188d0a6\") " Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.695728 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-utilities" (OuterVolumeSpecName: "utilities") pod "a5e76f29-ba71-41c1-9be7-9a95f188d0a6" (UID: "a5e76f29-ba71-41c1-9be7-9a95f188d0a6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.700300 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-kube-api-access-22xc7" (OuterVolumeSpecName: "kube-api-access-22xc7") pod "a5e76f29-ba71-41c1-9be7-9a95f188d0a6" (UID: "a5e76f29-ba71-41c1-9be7-9a95f188d0a6"). InnerVolumeSpecName "kube-api-access-22xc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.707881 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5e76f29-ba71-41c1-9be7-9a95f188d0a6" (UID: "a5e76f29-ba71-41c1-9be7-9a95f188d0a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.796818 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.796849 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:55:50 crc kubenswrapper[4825]: I1007 19:55:50.796859 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22xc7\" (UniqueName: \"kubernetes.io/projected/a5e76f29-ba71-41c1-9be7-9a95f188d0a6-kube-api-access-22xc7\") on node \"crc\" DevicePath \"\"" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.052128 4825 generic.go:334] "Generic (PLEG): container finished" podID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" 
containerID="5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa" exitCode=0 Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.052175 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45p9b" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.052191 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45p9b" event={"ID":"a5e76f29-ba71-41c1-9be7-9a95f188d0a6","Type":"ContainerDied","Data":"5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa"} Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.052573 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45p9b" event={"ID":"a5e76f29-ba71-41c1-9be7-9a95f188d0a6","Type":"ContainerDied","Data":"512c268c8171d7c516b5bfdc13e7f4f2c50e0ae206facf8036dbc461c798a50f"} Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.052593 4825 scope.go:117] "RemoveContainer" containerID="5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.086898 4825 scope.go:117] "RemoveContainer" containerID="3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.087811 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-45p9b"] Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.095835 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-45p9b"] Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.119471 4825 scope.go:117] "RemoveContainer" containerID="aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.180255 4825 scope.go:117] "RemoveContainer" containerID="5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa" Oct 07 
19:55:51 crc kubenswrapper[4825]: E1007 19:55:51.180879 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa\": container with ID starting with 5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa not found: ID does not exist" containerID="5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.180948 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa"} err="failed to get container status \"5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa\": rpc error: code = NotFound desc = could not find container \"5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa\": container with ID starting with 5b9fbc37909632f6ac303f0089dde00c17d52a3bce8f37c63bcba85020a480aa not found: ID does not exist" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.180976 4825 scope.go:117] "RemoveContainer" containerID="3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd" Oct 07 19:55:51 crc kubenswrapper[4825]: E1007 19:55:51.181544 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd\": container with ID starting with 3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd not found: ID does not exist" containerID="3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.181597 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd"} err="failed to get container status 
\"3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd\": rpc error: code = NotFound desc = could not find container \"3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd\": container with ID starting with 3087c7e730b8c8a15e417123bcb4f17597c850f18c6235bd5996d7fd968482bd not found: ID does not exist" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.181634 4825 scope.go:117] "RemoveContainer" containerID="aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40" Oct 07 19:55:51 crc kubenswrapper[4825]: E1007 19:55:51.182143 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40\": container with ID starting with aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40 not found: ID does not exist" containerID="aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.182195 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40"} err="failed to get container status \"aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40\": rpc error: code = NotFound desc = could not find container \"aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40\": container with ID starting with aa830ba70b13a5f14053588ba91e1036631b056eeff67e8351c992543ab75a40 not found: ID does not exist" Oct 07 19:55:51 crc kubenswrapper[4825]: I1007 19:55:51.814496 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" path="/var/lib/kubelet/pods/a5e76f29-ba71-41c1-9be7-9a95f188d0a6/volumes" Oct 07 19:56:01 crc kubenswrapper[4825]: I1007 19:56:01.801878 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 
19:56:01 crc kubenswrapper[4825]: E1007 19:56:01.802668 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.617011 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2g7bm"] Oct 07 19:56:10 crc kubenswrapper[4825]: E1007 19:56:10.618396 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="extract-content" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.618421 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="extract-content" Oct 07 19:56:10 crc kubenswrapper[4825]: E1007 19:56:10.618448 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="registry-server" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.618460 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="registry-server" Oct 07 19:56:10 crc kubenswrapper[4825]: E1007 19:56:10.618488 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="extract-utilities" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.618502 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="extract-utilities" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.618897 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a5e76f29-ba71-41c1-9be7-9a95f188d0a6" containerName="registry-server" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.621757 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.633439 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2g7bm"] Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.756146 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-catalog-content\") pod \"community-operators-2g7bm\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.756210 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-utilities\") pod \"community-operators-2g7bm\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.756396 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzwtk\" (UniqueName: \"kubernetes.io/projected/3e96c7d8-aade-46a0-af47-f1c84699e947-kube-api-access-xzwtk\") pod \"community-operators-2g7bm\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.857448 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-catalog-content\") pod \"community-operators-2g7bm\" (UID: 
\"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.857524 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-utilities\") pod \"community-operators-2g7bm\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.858326 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-utilities\") pod \"community-operators-2g7bm\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.858410 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-catalog-content\") pod \"community-operators-2g7bm\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.858555 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzwtk\" (UniqueName: \"kubernetes.io/projected/3e96c7d8-aade-46a0-af47-f1c84699e947-kube-api-access-xzwtk\") pod \"community-operators-2g7bm\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.888285 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzwtk\" (UniqueName: \"kubernetes.io/projected/3e96c7d8-aade-46a0-af47-f1c84699e947-kube-api-access-xzwtk\") pod \"community-operators-2g7bm\" (UID: 
\"3e96c7d8-aade-46a0-af47-f1c84699e947\") " pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:10 crc kubenswrapper[4825]: I1007 19:56:10.948310 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:11 crc kubenswrapper[4825]: I1007 19:56:11.568295 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2g7bm"] Oct 07 19:56:12 crc kubenswrapper[4825]: I1007 19:56:12.306029 4825 generic.go:334] "Generic (PLEG): container finished" podID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerID="19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1" exitCode=0 Oct 07 19:56:12 crc kubenswrapper[4825]: I1007 19:56:12.306095 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2g7bm" event={"ID":"3e96c7d8-aade-46a0-af47-f1c84699e947","Type":"ContainerDied","Data":"19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1"} Oct 07 19:56:12 crc kubenswrapper[4825]: I1007 19:56:12.306483 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2g7bm" event={"ID":"3e96c7d8-aade-46a0-af47-f1c84699e947","Type":"ContainerStarted","Data":"03583a634fa746c1e18c1b41348db7f2b74e2c63ebb7cacb0f9c29b33fa86ced"} Oct 07 19:56:12 crc kubenswrapper[4825]: I1007 19:56:12.796031 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:56:12 crc kubenswrapper[4825]: E1007 19:56:12.796415 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" 
podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:56:14 crc kubenswrapper[4825]: I1007 19:56:14.324877 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2g7bm" event={"ID":"3e96c7d8-aade-46a0-af47-f1c84699e947","Type":"ContainerStarted","Data":"b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843"} Oct 07 19:56:16 crc kubenswrapper[4825]: I1007 19:56:16.351804 4825 generic.go:334] "Generic (PLEG): container finished" podID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerID="b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843" exitCode=0 Oct 07 19:56:16 crc kubenswrapper[4825]: I1007 19:56:16.351857 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2g7bm" event={"ID":"3e96c7d8-aade-46a0-af47-f1c84699e947","Type":"ContainerDied","Data":"b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843"} Oct 07 19:56:17 crc kubenswrapper[4825]: I1007 19:56:17.370048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2g7bm" event={"ID":"3e96c7d8-aade-46a0-af47-f1c84699e947","Type":"ContainerStarted","Data":"83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0"} Oct 07 19:56:17 crc kubenswrapper[4825]: I1007 19:56:17.394162 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2g7bm" podStartSLOduration=2.864464107 podStartE2EDuration="7.394140736s" podCreationTimestamp="2025-10-07 19:56:10 +0000 UTC" firstStartedPulling="2025-10-07 19:56:12.308182762 +0000 UTC m=+3361.130221399" lastFinishedPulling="2025-10-07 19:56:16.837859391 +0000 UTC m=+3365.659898028" observedRunningTime="2025-10-07 19:56:17.391703609 +0000 UTC m=+3366.213742316" watchObservedRunningTime="2025-10-07 19:56:17.394140736 +0000 UTC m=+3366.216179393" Oct 07 19:56:20 crc kubenswrapper[4825]: I1007 19:56:20.952606 4825 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:20 crc kubenswrapper[4825]: I1007 19:56:20.953053 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:21 crc kubenswrapper[4825]: I1007 19:56:21.007214 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:25 crc kubenswrapper[4825]: I1007 19:56:25.795703 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:56:25 crc kubenswrapper[4825]: E1007 19:56:25.796422 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 19:56:31 crc kubenswrapper[4825]: I1007 19:56:31.006123 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:31 crc kubenswrapper[4825]: I1007 19:56:31.052035 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2g7bm"] Oct 07 19:56:31 crc kubenswrapper[4825]: I1007 19:56:31.526910 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2g7bm" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="registry-server" containerID="cri-o://83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0" gracePeriod=2 Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.093471 4825 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.223811 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-catalog-content\") pod \"3e96c7d8-aade-46a0-af47-f1c84699e947\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.224186 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzwtk\" (UniqueName: \"kubernetes.io/projected/3e96c7d8-aade-46a0-af47-f1c84699e947-kube-api-access-xzwtk\") pod \"3e96c7d8-aade-46a0-af47-f1c84699e947\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.224367 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-utilities\") pod \"3e96c7d8-aade-46a0-af47-f1c84699e947\" (UID: \"3e96c7d8-aade-46a0-af47-f1c84699e947\") " Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.226076 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-utilities" (OuterVolumeSpecName: "utilities") pod "3e96c7d8-aade-46a0-af47-f1c84699e947" (UID: "3e96c7d8-aade-46a0-af47-f1c84699e947"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.238562 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e96c7d8-aade-46a0-af47-f1c84699e947-kube-api-access-xzwtk" (OuterVolumeSpecName: "kube-api-access-xzwtk") pod "3e96c7d8-aade-46a0-af47-f1c84699e947" (UID: "3e96c7d8-aade-46a0-af47-f1c84699e947"). InnerVolumeSpecName "kube-api-access-xzwtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.298240 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e96c7d8-aade-46a0-af47-f1c84699e947" (UID: "3e96c7d8-aade-46a0-af47-f1c84699e947"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.326799 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzwtk\" (UniqueName: \"kubernetes.io/projected/3e96c7d8-aade-46a0-af47-f1c84699e947-kube-api-access-xzwtk\") on node \"crc\" DevicePath \"\"" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.327061 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.327167 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e96c7d8-aade-46a0-af47-f1c84699e947-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.543903 4825 generic.go:334] "Generic (PLEG): container finished" podID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerID="83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0" exitCode=0 Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.543983 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2g7bm" event={"ID":"3e96c7d8-aade-46a0-af47-f1c84699e947","Type":"ContainerDied","Data":"83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0"} Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.544032 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-2g7bm" event={"ID":"3e96c7d8-aade-46a0-af47-f1c84699e947","Type":"ContainerDied","Data":"03583a634fa746c1e18c1b41348db7f2b74e2c63ebb7cacb0f9c29b33fa86ced"} Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.544062 4825 scope.go:117] "RemoveContainer" containerID="83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.544299 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2g7bm" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.593345 4825 scope.go:117] "RemoveContainer" containerID="b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.608091 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2g7bm"] Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.623070 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2g7bm"] Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.630380 4825 scope.go:117] "RemoveContainer" containerID="19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.685707 4825 scope.go:117] "RemoveContainer" containerID="83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0" Oct 07 19:56:32 crc kubenswrapper[4825]: E1007 19:56:32.686479 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0\": container with ID starting with 83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0 not found: ID does not exist" containerID="83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 
19:56:32.686564 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0"} err="failed to get container status \"83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0\": rpc error: code = NotFound desc = could not find container \"83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0\": container with ID starting with 83e182613d2703afdd182aa2f39c6f72365e70af2ccbd781523c958879d281c0 not found: ID does not exist" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.686606 4825 scope.go:117] "RemoveContainer" containerID="b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843" Oct 07 19:56:32 crc kubenswrapper[4825]: E1007 19:56:32.686974 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843\": container with ID starting with b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843 not found: ID does not exist" containerID="b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.687007 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843"} err="failed to get container status \"b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843\": rpc error: code = NotFound desc = could not find container \"b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843\": container with ID starting with b43992c0c00e81d03ad0bfac65cb8e6417c0ce3ca19fe5507499765ea1320843 not found: ID does not exist" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.687025 4825 scope.go:117] "RemoveContainer" containerID="19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1" Oct 07 19:56:32 crc 
kubenswrapper[4825]: E1007 19:56:32.687519 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1\": container with ID starting with 19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1 not found: ID does not exist" containerID="19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1" Oct 07 19:56:32 crc kubenswrapper[4825]: I1007 19:56:32.687597 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1"} err="failed to get container status \"19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1\": rpc error: code = NotFound desc = could not find container \"19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1\": container with ID starting with 19fc6a179f4d658473680cc985d6a9b1fd2d79c4ff3ed4eefa73c83366fd62e1 not found: ID does not exist" Oct 07 19:56:33 crc kubenswrapper[4825]: I1007 19:56:33.808619 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" path="/var/lib/kubelet/pods/3e96c7d8-aade-46a0-af47-f1c84699e947/volumes" Oct 07 19:56:38 crc kubenswrapper[4825]: I1007 19:56:38.796104 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 19:56:39 crc kubenswrapper[4825]: I1007 19:56:39.629463 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"746ab6b7314c4e4728ba5c166df7d076205c8ec7f2dab0742da8f3345d6dc112"} Oct 07 19:58:48 crc kubenswrapper[4825]: I1007 19:58:48.072182 4825 generic.go:334] "Generic (PLEG): container finished" podID="cf9823f2-5baf-49ba-9da5-a8f13ac66d75" 
containerID="d826273016649b3791f3a8a74afbcf053c3fb9bad627c31f8f3250eed148706e" exitCode=0 Oct 07 19:58:48 crc kubenswrapper[4825]: I1007 19:58:48.073022 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"cf9823f2-5baf-49ba-9da5-a8f13ac66d75","Type":"ContainerDied","Data":"d826273016649b3791f3a8a74afbcf053c3fb9bad627c31f8f3250eed148706e"} Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.426321 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.561798 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.561913 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config-secret\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.561966 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-config-data\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.562059 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 
crc kubenswrapper[4825]: I1007 19:58:49.562085 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ca-certs\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.562116 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-workdir\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.562149 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-temporary\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.562210 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjbt4\" (UniqueName: \"kubernetes.io/projected/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-kube-api-access-cjbt4\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.562261 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ssh-key\") pod \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\" (UID: \"cf9823f2-5baf-49ba-9da5-a8f13ac66d75\") " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.562587 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.562739 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-config-data" (OuterVolumeSpecName: "config-data") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.563272 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.563296 4825 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.568316 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.569416 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-kube-api-access-cjbt4" (OuterVolumeSpecName: "kube-api-access-cjbt4") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "kube-api-access-cjbt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.570523 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.591929 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.594994 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.597543 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.625152 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "cf9823f2-5baf-49ba-9da5-a8f13ac66d75" (UID: "cf9823f2-5baf-49ba-9da5-a8f13ac66d75"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.665932 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.666191 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.666207 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.666244 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-openstack-config\") on node 
\"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.666261 4825 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.666271 4825 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.666281 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjbt4\" (UniqueName: \"kubernetes.io/projected/cf9823f2-5baf-49ba-9da5-a8f13ac66d75-kube-api-access-cjbt4\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.711395 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 07 19:58:49 crc kubenswrapper[4825]: I1007 19:58:49.767751 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 07 19:58:50 crc kubenswrapper[4825]: I1007 19:58:50.091849 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"cf9823f2-5baf-49ba-9da5-a8f13ac66d75","Type":"ContainerDied","Data":"83cd530afbd6ffc6a0ed355381c2097222402612c99714a0b2b2ac35d2c6e661"} Oct 07 19:58:50 crc kubenswrapper[4825]: I1007 19:58:50.092177 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83cd530afbd6ffc6a0ed355381c2097222402612c99714a0b2b2ac35d2c6e661" Oct 07 19:58:50 crc kubenswrapper[4825]: I1007 19:58:50.092275 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.964120 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 19:58:51 crc kubenswrapper[4825]: E1007 19:58:51.964920 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="extract-content" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.964935 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="extract-content" Oct 07 19:58:51 crc kubenswrapper[4825]: E1007 19:58:51.964948 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="extract-utilities" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.964956 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="extract-utilities" Oct 07 19:58:51 crc kubenswrapper[4825]: E1007 19:58:51.964978 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9823f2-5baf-49ba-9da5-a8f13ac66d75" containerName="tempest-tests-tempest-tests-runner" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.964987 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9823f2-5baf-49ba-9da5-a8f13ac66d75" containerName="tempest-tests-tempest-tests-runner" Oct 07 19:58:51 crc kubenswrapper[4825]: E1007 19:58:51.965012 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="registry-server" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.965020 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="registry-server" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.965275 4825 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="3e96c7d8-aade-46a0-af47-f1c84699e947" containerName="registry-server" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.965313 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9823f2-5baf-49ba-9da5-a8f13ac66d75" containerName="tempest-tests-tempest-tests-runner" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.966012 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.968143 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ft7fd" Oct 07 19:58:51 crc kubenswrapper[4825]: I1007 19:58:51.984315 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.116365 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.116554 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjpx4\" (UniqueName: \"kubernetes.io/projected/6a1f604e-4e0b-42a3-a2c5-0b42417baa2f-kube-api-access-hjpx4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.218091 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjpx4\" (UniqueName: 
\"kubernetes.io/projected/6a1f604e-4e0b-42a3-a2c5-0b42417baa2f-kube-api-access-hjpx4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.218195 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.218623 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.250449 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.251544 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjpx4\" (UniqueName: \"kubernetes.io/projected/6a1f604e-4e0b-42a3-a2c5-0b42417baa2f-kube-api-access-hjpx4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 
crc kubenswrapper[4825]: I1007 19:58:52.316243 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.765491 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 19:58:52 crc kubenswrapper[4825]: I1007 19:58:52.771274 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 19:58:53 crc kubenswrapper[4825]: I1007 19:58:53.124917 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f","Type":"ContainerStarted","Data":"cc8d6ed43f628607db04a32fe84203707d3adc6bb53d713243435c199f3a8676"} Oct 07 19:58:55 crc kubenswrapper[4825]: I1007 19:58:55.162626 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6a1f604e-4e0b-42a3-a2c5-0b42417baa2f","Type":"ContainerStarted","Data":"248c12fb943bfc091035afe67ac5b181d49720d2ce8425737212ac1cddefa7ea"} Oct 07 19:58:55 crc kubenswrapper[4825]: I1007 19:58:55.201684 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.910635568 podStartE2EDuration="4.201653417s" podCreationTimestamp="2025-10-07 19:58:51 +0000 UTC" firstStartedPulling="2025-10-07 19:58:52.771010326 +0000 UTC m=+3521.593048963" lastFinishedPulling="2025-10-07 19:58:54.062028175 +0000 UTC m=+3522.884066812" observedRunningTime="2025-10-07 19:58:55.183478218 +0000 UTC m=+3524.005516945" watchObservedRunningTime="2025-10-07 19:58:55.201653417 +0000 UTC m=+3524.023692084" Oct 07 19:59:05 crc kubenswrapper[4825]: I1007 19:59:05.709291 4825 patch_prober.go:28] interesting 
pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:59:05 crc kubenswrapper[4825]: I1007 19:59:05.709969 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.581324 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-phcqg/must-gather-r7lmv"] Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.583312 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.586457 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-phcqg"/"openshift-service-ca.crt" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.586681 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-phcqg"/"kube-root-ca.crt" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.590067 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-phcqg/must-gather-r7lmv"] Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.634584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjq79\" (UniqueName: \"kubernetes.io/projected/4eafd191-a4e0-46a6-807e-810e66ef4eec-kube-api-access-zjq79\") pod \"must-gather-r7lmv\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:11 crc 
kubenswrapper[4825]: I1007 19:59:11.634727 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eafd191-a4e0-46a6-807e-810e66ef4eec-must-gather-output\") pod \"must-gather-r7lmv\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.735948 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjq79\" (UniqueName: \"kubernetes.io/projected/4eafd191-a4e0-46a6-807e-810e66ef4eec-kube-api-access-zjq79\") pod \"must-gather-r7lmv\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.736044 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eafd191-a4e0-46a6-807e-810e66ef4eec-must-gather-output\") pod \"must-gather-r7lmv\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.736536 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eafd191-a4e0-46a6-807e-810e66ef4eec-must-gather-output\") pod \"must-gather-r7lmv\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:11 crc kubenswrapper[4825]: I1007 19:59:11.752491 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjq79\" (UniqueName: \"kubernetes.io/projected/4eafd191-a4e0-46a6-807e-810e66ef4eec-kube-api-access-zjq79\") pod \"must-gather-r7lmv\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:11 crc 
kubenswrapper[4825]: I1007 19:59:11.917831 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 19:59:12 crc kubenswrapper[4825]: I1007 19:59:12.510501 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-phcqg/must-gather-r7lmv"] Oct 07 19:59:13 crc kubenswrapper[4825]: I1007 19:59:13.363177 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/must-gather-r7lmv" event={"ID":"4eafd191-a4e0-46a6-807e-810e66ef4eec","Type":"ContainerStarted","Data":"d4b85ac97e32dc5e64b343726a6a776e766126b4799cce4b1e1c81733e76fae0"} Oct 07 19:59:17 crc kubenswrapper[4825]: I1007 19:59:17.401889 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/must-gather-r7lmv" event={"ID":"4eafd191-a4e0-46a6-807e-810e66ef4eec","Type":"ContainerStarted","Data":"3d34aa3e799fe2e67894188bcdb796621c86ac7a2f62faacbf5339bde7cd6974"} Oct 07 19:59:18 crc kubenswrapper[4825]: I1007 19:59:18.413137 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/must-gather-r7lmv" event={"ID":"4eafd191-a4e0-46a6-807e-810e66ef4eec","Type":"ContainerStarted","Data":"a4f7e12674486d6146e146468603cb13388cacb44168543673a82b71fdf820bd"} Oct 07 19:59:18 crc kubenswrapper[4825]: I1007 19:59:18.435292 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-phcqg/must-gather-r7lmv" podStartSLOduration=3.052875793 podStartE2EDuration="7.435270626s" podCreationTimestamp="2025-10-07 19:59:11 +0000 UTC" firstStartedPulling="2025-10-07 19:59:12.514533854 +0000 UTC m=+3541.336572491" lastFinishedPulling="2025-10-07 19:59:16.896928687 +0000 UTC m=+3545.718967324" observedRunningTime="2025-10-07 19:59:18.430545176 +0000 UTC m=+3547.252583813" watchObservedRunningTime="2025-10-07 19:59:18.435270626 +0000 UTC m=+3547.257309273" Oct 07 19:59:20 crc kubenswrapper[4825]: I1007 
19:59:20.875021 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-phcqg/crc-debug-hdp2n"] Oct 07 19:59:20 crc kubenswrapper[4825]: I1007 19:59:20.876501 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:20 crc kubenswrapper[4825]: I1007 19:59:20.879695 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-phcqg"/"default-dockercfg-x5dfk" Oct 07 19:59:20 crc kubenswrapper[4825]: I1007 19:59:20.964645 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48hdb\" (UniqueName: \"kubernetes.io/projected/49a8430e-be93-43be-8996-21969f8a8fa6-kube-api-access-48hdb\") pod \"crc-debug-hdp2n\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:20 crc kubenswrapper[4825]: I1007 19:59:20.964906 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49a8430e-be93-43be-8996-21969f8a8fa6-host\") pod \"crc-debug-hdp2n\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:21 crc kubenswrapper[4825]: I1007 19:59:21.067123 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48hdb\" (UniqueName: \"kubernetes.io/projected/49a8430e-be93-43be-8996-21969f8a8fa6-kube-api-access-48hdb\") pod \"crc-debug-hdp2n\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:21 crc kubenswrapper[4825]: I1007 19:59:21.067165 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49a8430e-be93-43be-8996-21969f8a8fa6-host\") pod \"crc-debug-hdp2n\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " 
pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:21 crc kubenswrapper[4825]: I1007 19:59:21.067356 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49a8430e-be93-43be-8996-21969f8a8fa6-host\") pod \"crc-debug-hdp2n\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:21 crc kubenswrapper[4825]: I1007 19:59:21.084947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48hdb\" (UniqueName: \"kubernetes.io/projected/49a8430e-be93-43be-8996-21969f8a8fa6-kube-api-access-48hdb\") pod \"crc-debug-hdp2n\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:21 crc kubenswrapper[4825]: I1007 19:59:21.200278 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 19:59:21 crc kubenswrapper[4825]: W1007 19:59:21.255791 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a8430e_be93_43be_8996_21969f8a8fa6.slice/crio-88857c7682d78919cf3761a633a66e411b42f13e6d9dda6e87c8c2b710108835 WatchSource:0}: Error finding container 88857c7682d78919cf3761a633a66e411b42f13e6d9dda6e87c8c2b710108835: Status 404 returned error can't find the container with id 88857c7682d78919cf3761a633a66e411b42f13e6d9dda6e87c8c2b710108835 Oct 07 19:59:21 crc kubenswrapper[4825]: I1007 19:59:21.441601 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-hdp2n" event={"ID":"49a8430e-be93-43be-8996-21969f8a8fa6","Type":"ContainerStarted","Data":"88857c7682d78919cf3761a633a66e411b42f13e6d9dda6e87c8c2b710108835"} Oct 07 19:59:32 crc kubenswrapper[4825]: I1007 19:59:32.557318 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-phcqg/crc-debug-hdp2n" event={"ID":"49a8430e-be93-43be-8996-21969f8a8fa6","Type":"ContainerStarted","Data":"e75e895c4653c331d41e03141d836ddd6a18780952b183acc372378bc9bf3ce1"} Oct 07 19:59:32 crc kubenswrapper[4825]: I1007 19:59:32.575960 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-phcqg/crc-debug-hdp2n" podStartSLOduration=2.479338545 podStartE2EDuration="12.575939578s" podCreationTimestamp="2025-10-07 19:59:20 +0000 UTC" firstStartedPulling="2025-10-07 19:59:21.257780526 +0000 UTC m=+3550.079819163" lastFinishedPulling="2025-10-07 19:59:31.354381519 +0000 UTC m=+3560.176420196" observedRunningTime="2025-10-07 19:59:32.569764551 +0000 UTC m=+3561.391803188" watchObservedRunningTime="2025-10-07 19:59:32.575939578 +0000 UTC m=+3561.397978225" Oct 07 19:59:35 crc kubenswrapper[4825]: I1007 19:59:35.708560 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 19:59:35 crc kubenswrapper[4825]: I1007 19:59:35.709081 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.188741 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9"] Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.190823 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.194434 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.194602 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.208004 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9"] Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.308510 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdclz\" (UniqueName: \"kubernetes.io/projected/01d6161c-af0b-44fe-b160-e9e85f478032-kube-api-access-pdclz\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.308841 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01d6161c-af0b-44fe-b160-e9e85f478032-secret-volume\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.308901 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01d6161c-af0b-44fe-b160-e9e85f478032-config-volume\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.410908 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdclz\" (UniqueName: \"kubernetes.io/projected/01d6161c-af0b-44fe-b160-e9e85f478032-kube-api-access-pdclz\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.411025 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01d6161c-af0b-44fe-b160-e9e85f478032-secret-volume\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.411153 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01d6161c-af0b-44fe-b160-e9e85f478032-config-volume\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.412162 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01d6161c-af0b-44fe-b160-e9e85f478032-config-volume\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.418912 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/01d6161c-af0b-44fe-b160-e9e85f478032-secret-volume\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.428648 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdclz\" (UniqueName: \"kubernetes.io/projected/01d6161c-af0b-44fe-b160-e9e85f478032-kube-api-access-pdclz\") pod \"collect-profiles-29331120-4xpp9\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:00 crc kubenswrapper[4825]: I1007 20:00:00.526526 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:01 crc kubenswrapper[4825]: I1007 20:00:01.011849 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9"] Oct 07 20:00:01 crc kubenswrapper[4825]: W1007 20:00:01.025919 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d6161c_af0b_44fe_b160_e9e85f478032.slice/crio-1d7ebb719236fc822cc4cae2ff9541948061238be558e1641628b34a57cf55fd WatchSource:0}: Error finding container 1d7ebb719236fc822cc4cae2ff9541948061238be558e1641628b34a57cf55fd: Status 404 returned error can't find the container with id 1d7ebb719236fc822cc4cae2ff9541948061238be558e1641628b34a57cf55fd Oct 07 20:00:01 crc kubenswrapper[4825]: I1007 20:00:01.853538 4825 generic.go:334] "Generic (PLEG): container finished" podID="01d6161c-af0b-44fe-b160-e9e85f478032" containerID="ee175064298dd64e84824274eb1841372b1c05e6ca4f8ab0ee4a840f453942fc" exitCode=0 Oct 07 20:00:01 crc kubenswrapper[4825]: I1007 20:00:01.853633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" event={"ID":"01d6161c-af0b-44fe-b160-e9e85f478032","Type":"ContainerDied","Data":"ee175064298dd64e84824274eb1841372b1c05e6ca4f8ab0ee4a840f453942fc"} Oct 07 20:00:01 crc kubenswrapper[4825]: I1007 20:00:01.854034 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" event={"ID":"01d6161c-af0b-44fe-b160-e9e85f478032","Type":"ContainerStarted","Data":"1d7ebb719236fc822cc4cae2ff9541948061238be558e1641628b34a57cf55fd"} Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.172560 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.272286 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01d6161c-af0b-44fe-b160-e9e85f478032-config-volume\") pod \"01d6161c-af0b-44fe-b160-e9e85f478032\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.272563 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01d6161c-af0b-44fe-b160-e9e85f478032-secret-volume\") pod \"01d6161c-af0b-44fe-b160-e9e85f478032\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.272638 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdclz\" (UniqueName: \"kubernetes.io/projected/01d6161c-af0b-44fe-b160-e9e85f478032-kube-api-access-pdclz\") pod \"01d6161c-af0b-44fe-b160-e9e85f478032\" (UID: \"01d6161c-af0b-44fe-b160-e9e85f478032\") " Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.273018 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/01d6161c-af0b-44fe-b160-e9e85f478032-config-volume" (OuterVolumeSpecName: "config-volume") pod "01d6161c-af0b-44fe-b160-e9e85f478032" (UID: "01d6161c-af0b-44fe-b160-e9e85f478032"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.273216 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01d6161c-af0b-44fe-b160-e9e85f478032-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.280377 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d6161c-af0b-44fe-b160-e9e85f478032-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "01d6161c-af0b-44fe-b160-e9e85f478032" (UID: "01d6161c-af0b-44fe-b160-e9e85f478032"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.285419 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d6161c-af0b-44fe-b160-e9e85f478032-kube-api-access-pdclz" (OuterVolumeSpecName: "kube-api-access-pdclz") pod "01d6161c-af0b-44fe-b160-e9e85f478032" (UID: "01d6161c-af0b-44fe-b160-e9e85f478032"). InnerVolumeSpecName "kube-api-access-pdclz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.374424 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01d6161c-af0b-44fe-b160-e9e85f478032-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.374453 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdclz\" (UniqueName: \"kubernetes.io/projected/01d6161c-af0b-44fe-b160-e9e85f478032-kube-api-access-pdclz\") on node \"crc\" DevicePath \"\"" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.879296 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" event={"ID":"01d6161c-af0b-44fe-b160-e9e85f478032","Type":"ContainerDied","Data":"1d7ebb719236fc822cc4cae2ff9541948061238be558e1641628b34a57cf55fd"} Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.880310 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d7ebb719236fc822cc4cae2ff9541948061238be558e1641628b34a57cf55fd" Oct 07 20:00:03 crc kubenswrapper[4825]: I1007 20:00:03.879327 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331120-4xpp9" Oct 07 20:00:04 crc kubenswrapper[4825]: I1007 20:00:04.240094 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"] Oct 07 20:00:04 crc kubenswrapper[4825]: I1007 20:00:04.248844 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331075-rqvtx"] Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.708484 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.709010 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.709053 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.709709 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"746ab6b7314c4e4728ba5c166df7d076205c8ec7f2dab0742da8f3345d6dc112"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.709753 4825 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://746ab6b7314c4e4728ba5c166df7d076205c8ec7f2dab0742da8f3345d6dc112" gracePeriod=600 Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.807997 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f311fd9-7ab5-4cf4-80ed-701ed5d212ef" path="/var/lib/kubelet/pods/0f311fd9-7ab5-4cf4-80ed-701ed5d212ef/volumes" Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.929125 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="746ab6b7314c4e4728ba5c166df7d076205c8ec7f2dab0742da8f3345d6dc112" exitCode=0 Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.929289 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"746ab6b7314c4e4728ba5c166df7d076205c8ec7f2dab0742da8f3345d6dc112"} Oct 07 20:00:05 crc kubenswrapper[4825]: I1007 20:00:05.929441 4825 scope.go:117] "RemoveContainer" containerID="626fbe9693f82a4f34103290a1d1828d2cadfd6fb07a904dc0d2e6247e2eff94" Oct 07 20:00:06 crc kubenswrapper[4825]: I1007 20:00:06.946470 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962"} Oct 07 20:00:22 crc kubenswrapper[4825]: I1007 20:00:22.347194 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-657f997574-lnlbm_dc023b5f-d12b-4ce6-9cc6-1bac1fa48455/barbican-api-log/0.log" Oct 07 20:00:22 crc kubenswrapper[4825]: I1007 20:00:22.351810 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-657f997574-lnlbm_dc023b5f-d12b-4ce6-9cc6-1bac1fa48455/barbican-api/0.log" Oct 07 20:00:22 crc kubenswrapper[4825]: I1007 20:00:22.521570 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-594d76bc86-m9c6m_ea2e502f-f902-43be-989a-2f0ed4e3ae02/barbican-keystone-listener/0.log" Oct 07 20:00:22 crc kubenswrapper[4825]: I1007 20:00:22.572905 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-594d76bc86-m9c6m_ea2e502f-f902-43be-989a-2f0ed4e3ae02/barbican-keystone-listener-log/0.log" Oct 07 20:00:22 crc kubenswrapper[4825]: I1007 20:00:22.794657 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664844745f-dvzxg_56d9e279-5942-4a24-84db-5d7f8fcabcba/barbican-worker/0.log" Oct 07 20:00:22 crc kubenswrapper[4825]: I1007 20:00:22.823129 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664844745f-dvzxg_56d9e279-5942-4a24-84db-5d7f8fcabcba/barbican-worker-log/0.log" Oct 07 20:00:22 crc kubenswrapper[4825]: I1007 20:00:22.998510 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2_ede848ae-130b-4c5c-a4fb-873d9ea65cb6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.191271 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/ceilometer-central-agent/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.248911 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/proxy-httpd/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.257596 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/ceilometer-notification-agent/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.370022 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/sg-core/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.473541 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_598ea581-8e2b-47f6-8360-3907ab4c3f49/cinder-api/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.698548 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_598ea581-8e2b-47f6-8360-3907ab4c3f49/cinder-api-log/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.858250 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_06faa8d0-8aff-4422-a2cb-8643f6e920a8/cinder-scheduler/0.log" Oct 07 20:00:23 crc kubenswrapper[4825]: I1007 20:00:23.947286 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_06faa8d0-8aff-4422-a2cb-8643f6e920a8/probe/0.log" Oct 07 20:00:24 crc kubenswrapper[4825]: I1007 20:00:24.083855 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl_ad194ba9-9675-4a8e-be19-b44964a5b493/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:24 crc kubenswrapper[4825]: I1007 20:00:24.295220 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z_26532318-7138-4557-9814-febc4ba75fb8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:24 crc kubenswrapper[4825]: I1007 20:00:24.440519 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn_da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:24 crc kubenswrapper[4825]: I1007 20:00:24.511371 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gkpsf_4dbb4b22-9fab-40a8-8fee-1d77e4e37c80/init/0.log" Oct 07 20:00:24 crc kubenswrapper[4825]: I1007 20:00:24.698708 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gkpsf_4dbb4b22-9fab-40a8-8fee-1d77e4e37c80/dnsmasq-dns/0.log" Oct 07 20:00:24 crc kubenswrapper[4825]: I1007 20:00:24.717553 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gkpsf_4dbb4b22-9fab-40a8-8fee-1d77e4e37c80/init/0.log" Oct 07 20:00:24 crc kubenswrapper[4825]: I1007 20:00:24.916822 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ngshx_35f68c5c-870d-448d-a680-decef3790f6b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.068404 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1356ee9f-f727-42b6-9a53-f80e78720704/glance-httpd/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.105150 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1356ee9f-f727-42b6-9a53-f80e78720704/glance-log/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.302005 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_747f4079-112d-4889-937f-fc39c9d75819/glance-httpd/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.306029 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_747f4079-112d-4889-937f-fc39c9d75819/glance-log/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.701517 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58d7dd5b56-nhlgz_710a139f-bf12-4021-b702-3e40d49febf1/horizon/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.765689 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr_e3a03ee0-54d8-44d6-94fb-59a5bbed04fd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.851053 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58d7dd5b56-nhlgz_710a139f-bf12-4021-b702-3e40d49febf1/horizon-log/0.log" Oct 07 20:00:25 crc kubenswrapper[4825]: I1007 20:00:25.956864 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tb8kn_c2e86406-64eb-4c0c-8f9d-38b2a64ddc48/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:26 crc kubenswrapper[4825]: I1007 20:00:26.149776 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e/kube-state-metrics/0.log" Oct 07 20:00:26 crc kubenswrapper[4825]: I1007 20:00:26.182671 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6b848888b7-8bpk8_a0cf82d8-414d-4486-9cef-be5b38e75745/keystone-api/0.log" Oct 07 20:00:26 crc kubenswrapper[4825]: I1007 20:00:26.354635 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw_6a978ccd-af77-4892-9bae-0f87170eb4a1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:26 crc kubenswrapper[4825]: I1007 20:00:26.796463 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-7d47b47d5-hc6q5_e07eddca-def8-4a86-8d72-0c916ba6b6c1/neutron-httpd/0.log" Oct 07 20:00:26 crc kubenswrapper[4825]: I1007 20:00:26.947758 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d47b47d5-hc6q5_e07eddca-def8-4a86-8d72-0c916ba6b6c1/neutron-api/0.log" Oct 07 20:00:27 crc kubenswrapper[4825]: I1007 20:00:27.015681 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759_e22aff57-c4de-445a-b196-23d2e791a10f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:27 crc kubenswrapper[4825]: I1007 20:00:27.521171 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7f4fc7-89f3-4e32-94fe-f4117c1ca522/nova-api-log/0.log" Oct 07 20:00:27 crc kubenswrapper[4825]: I1007 20:00:27.719496 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7f4fc7-89f3-4e32-94fe-f4117c1ca522/nova-api-api/0.log" Oct 07 20:00:28 crc kubenswrapper[4825]: I1007 20:00:28.023502 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_395c3018-72fe-4e48-a92d-e98026e550a3/nova-cell0-conductor-conductor/0.log" Oct 07 20:00:28 crc kubenswrapper[4825]: I1007 20:00:28.071245 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7/nova-cell1-conductor-conductor/0.log" Oct 07 20:00:28 crc kubenswrapper[4825]: I1007 20:00:28.305204 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_89c586f2-b817-4c06-92cf-8b7832e8acc6/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 20:00:28 crc kubenswrapper[4825]: I1007 20:00:28.503437 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-l9nwl_3c1f45e7-330e-4c79-8609-2988aac67b05/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:28 crc kubenswrapper[4825]: I1007 20:00:28.588356 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c4c4cfbe-20c8-402c-90b0-040fbbb0d58e/nova-metadata-log/0.log" Oct 07 20:00:28 crc kubenswrapper[4825]: I1007 20:00:28.972746 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_146610c1-1e58-4a52-ba58-b190f53f4a03/nova-scheduler-scheduler/0.log" Oct 07 20:00:29 crc kubenswrapper[4825]: I1007 20:00:29.188243 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6451a8c0-c6b1-4098-846d-24fe8c26d849/mysql-bootstrap/0.log" Oct 07 20:00:29 crc kubenswrapper[4825]: I1007 20:00:29.404259 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6451a8c0-c6b1-4098-846d-24fe8c26d849/mysql-bootstrap/0.log" Oct 07 20:00:29 crc kubenswrapper[4825]: I1007 20:00:29.424234 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6451a8c0-c6b1-4098-846d-24fe8c26d849/galera/0.log" Oct 07 20:00:29 crc kubenswrapper[4825]: I1007 20:00:29.659093 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fa0ba0a4-872f-4ebd-8ee1-0e57174648a9/mysql-bootstrap/0.log" Oct 07 20:00:29 crc kubenswrapper[4825]: I1007 20:00:29.876691 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fa0ba0a4-872f-4ebd-8ee1-0e57174648a9/mysql-bootstrap/0.log" Oct 07 20:00:29 crc kubenswrapper[4825]: I1007 20:00:29.906054 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fa0ba0a4-872f-4ebd-8ee1-0e57174648a9/galera/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.014866 4825 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-metadata-0_c4c4cfbe-20c8-402c-90b0-040fbbb0d58e/nova-metadata-metadata/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.155730 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_44d41a47-16c3-4bd1-be08-b06bd6f8734f/openstackclient/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.348685 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-j2dqr_5476bb52-18e5-41e6-b087-3cd2d6e81a87/openstack-network-exporter/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.474806 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mqtlv_0392f085-cd23-439c-b8aa-e3c94fc320b8/ovn-controller/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.625203 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovsdb-server-init/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.845865 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovsdb-server-init/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.849115 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovs-vswitchd/0.log" Oct 07 20:00:30 crc kubenswrapper[4825]: I1007 20:00:30.865972 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovsdb-server/0.log" Oct 07 20:00:31 crc kubenswrapper[4825]: I1007 20:00:31.156945 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xxqvz_a32b4f65-af6c-4bed-a97c-ec9ced0b4c45/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:31 crc kubenswrapper[4825]: I1007 
20:00:31.493748 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58/openstack-network-exporter/0.log" Oct 07 20:00:31 crc kubenswrapper[4825]: I1007 20:00:31.518295 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58/ovn-northd/0.log" Oct 07 20:00:31 crc kubenswrapper[4825]: I1007 20:00:31.692654 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_96ff0bc5-e277-4f6a-a3b3-815e01ac42b7/ovsdbserver-nb/0.log" Oct 07 20:00:31 crc kubenswrapper[4825]: I1007 20:00:31.771432 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_96ff0bc5-e277-4f6a-a3b3-815e01ac42b7/openstack-network-exporter/0.log" Oct 07 20:00:31 crc kubenswrapper[4825]: I1007 20:00:31.904928 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bf48b556-d051-49b5-b9fb-fa6b325e0f79/openstack-network-exporter/0.log" Oct 07 20:00:31 crc kubenswrapper[4825]: I1007 20:00:31.987576 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bf48b556-d051-49b5-b9fb-fa6b325e0f79/ovsdbserver-sb/0.log" Oct 07 20:00:32 crc kubenswrapper[4825]: I1007 20:00:32.155708 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f4b8c987b-kjdd8_4297247b-64e7-4379-aa35-9e2bf6d2d5d5/placement-api/0.log" Oct 07 20:00:32 crc kubenswrapper[4825]: I1007 20:00:32.296597 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f4b8c987b-kjdd8_4297247b-64e7-4379-aa35-9e2bf6d2d5d5/placement-log/0.log" Oct 07 20:00:32 crc kubenswrapper[4825]: I1007 20:00:32.423541 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e773083b-ae36-44eb-bb82-18b12b504439/setup-container/0.log" Oct 07 20:00:32 crc kubenswrapper[4825]: I1007 20:00:32.595706 4825 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e773083b-ae36-44eb-bb82-18b12b504439/setup-container/0.log" Oct 07 20:00:32 crc kubenswrapper[4825]: I1007 20:00:32.632452 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e773083b-ae36-44eb-bb82-18b12b504439/rabbitmq/0.log" Oct 07 20:00:32 crc kubenswrapper[4825]: I1007 20:00:32.811459 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_18c777f8-aad0-482a-b132-ad417d64eb6e/setup-container/0.log" Oct 07 20:00:33 crc kubenswrapper[4825]: I1007 20:00:33.043725 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_18c777f8-aad0-482a-b132-ad417d64eb6e/setup-container/0.log" Oct 07 20:00:33 crc kubenswrapper[4825]: I1007 20:00:33.099582 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_18c777f8-aad0-482a-b132-ad417d64eb6e/rabbitmq/0.log" Oct 07 20:00:33 crc kubenswrapper[4825]: I1007 20:00:33.296863 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn_5cd31618-4e62-438f-b168-1d322052785d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:33 crc kubenswrapper[4825]: I1007 20:00:33.368209 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-q2wcd_7f652c6b-fc94-47dc-90ec-a19d7e49d728/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:33 crc kubenswrapper[4825]: I1007 20:00:33.541692 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw_00148c9a-f926-4ff0-a78a-239fae3968d5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:33 crc kubenswrapper[4825]: I1007 20:00:33.706146 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8kr2p_fdf4da6b-b218-4ab1-87c6-7b8cfcef6810/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:33 crc kubenswrapper[4825]: I1007 20:00:33.837709 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d55wk_63e9c706-689c-43be-a9a3-67f20fbfea88/ssh-known-hosts-edpm-deployment/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.036487 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57f8b4b869-t42c2_aeabd5f0-6573-402d-a5df-c0bc41d16a67/proxy-server/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.126296 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57f8b4b869-t42c2_aeabd5f0-6573-402d-a5df-c0bc41d16a67/proxy-httpd/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.240999 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-whwz4_13a46859-41a7-4783-9c3d-be9e48db5526/swift-ring-rebalance/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.378069 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-auditor/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.614579 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-reaper/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.734915 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-server/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.789568 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-auditor/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 
20:00:34.809544 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-replicator/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.949421 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-server/0.log" Oct 07 20:00:34 crc kubenswrapper[4825]: I1007 20:00:34.991349 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-replicator/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.035755 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-updater/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.172318 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-expirer/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.267832 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-auditor/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.289015 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-replicator/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.399701 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-server/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.479523 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-updater/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.487189 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/rsync/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.607337 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/swift-recon-cron/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.792819 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv_803e0c1d-979b-47da-ba63-cad0323972a8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:35 crc kubenswrapper[4825]: I1007 20:00:35.940562 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_cf9823f2-5baf-49ba-9da5-a8f13ac66d75/tempest-tests-tempest-tests-runner/0.log" Oct 07 20:00:36 crc kubenswrapper[4825]: I1007 20:00:36.089724 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6a1f604e-4e0b-42a3-a2c5-0b42417baa2f/test-operator-logs-container/0.log" Oct 07 20:00:36 crc kubenswrapper[4825]: I1007 20:00:36.241265 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx_b0c642ee-a887-496b-a212-48601b94af99/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:00:41 crc kubenswrapper[4825]: I1007 20:00:41.400796 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8448c74b-bea3-42c0-95da-ab251a90ca9f/memcached/0.log" Oct 07 20:00:50 crc kubenswrapper[4825]: I1007 20:00:50.143738 4825 scope.go:117] "RemoveContainer" containerID="1c9a330ee06367529325684f48ea3c7d3f7debbc6089f4b5c52cf6b88464fa9c" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.153913 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29331121-n45sq"] Oct 07 20:01:00 crc kubenswrapper[4825]: E1007 20:01:00.154766 4825 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d6161c-af0b-44fe-b160-e9e85f478032" containerName="collect-profiles" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.154778 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d6161c-af0b-44fe-b160-e9e85f478032" containerName="collect-profiles" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.154961 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d6161c-af0b-44fe-b160-e9e85f478032" containerName="collect-profiles" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.155590 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.167063 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29331121-n45sq"] Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.229300 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-config-data\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.229353 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgnjc\" (UniqueName: \"kubernetes.io/projected/48f51d5f-d0ae-4123-969f-8bda81cbfb85-kube-api-access-hgnjc\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.229429 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-combined-ca-bundle\") pod 
\"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.229520 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-fernet-keys\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.331095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-combined-ca-bundle\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.331216 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-fernet-keys\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.331301 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-config-data\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.331336 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgnjc\" (UniqueName: \"kubernetes.io/projected/48f51d5f-d0ae-4123-969f-8bda81cbfb85-kube-api-access-hgnjc\") pod \"keystone-cron-29331121-n45sq\" (UID: 
\"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.340791 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-fernet-keys\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.340940 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-config-data\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.344491 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-combined-ca-bundle\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.360976 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgnjc\" (UniqueName: \"kubernetes.io/projected/48f51d5f-d0ae-4123-969f-8bda81cbfb85-kube-api-access-hgnjc\") pod \"keystone-cron-29331121-n45sq\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:00 crc kubenswrapper[4825]: I1007 20:01:00.533644 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:01 crc kubenswrapper[4825]: I1007 20:01:01.001074 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29331121-n45sq"] Oct 07 20:01:01 crc kubenswrapper[4825]: I1007 20:01:01.477155 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331121-n45sq" event={"ID":"48f51d5f-d0ae-4123-969f-8bda81cbfb85","Type":"ContainerStarted","Data":"84ed277a7ff7393c1355320ddfddf6cff3124528e001c3c7071233a740b98538"} Oct 07 20:01:01 crc kubenswrapper[4825]: I1007 20:01:01.478662 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331121-n45sq" event={"ID":"48f51d5f-d0ae-4123-969f-8bda81cbfb85","Type":"ContainerStarted","Data":"f3aa40729b5d606431e920e1a7c0dd3685428c25ff7299f0345b4803d686d8e7"} Oct 07 20:01:01 crc kubenswrapper[4825]: I1007 20:01:01.493945 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29331121-n45sq" podStartSLOduration=1.493927626 podStartE2EDuration="1.493927626s" podCreationTimestamp="2025-10-07 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 20:01:01.493341797 +0000 UTC m=+3650.315380434" watchObservedRunningTime="2025-10-07 20:01:01.493927626 +0000 UTC m=+3650.315966263" Oct 07 20:01:03 crc kubenswrapper[4825]: I1007 20:01:03.501915 4825 generic.go:334] "Generic (PLEG): container finished" podID="48f51d5f-d0ae-4123-969f-8bda81cbfb85" containerID="84ed277a7ff7393c1355320ddfddf6cff3124528e001c3c7071233a740b98538" exitCode=0 Oct 07 20:01:03 crc kubenswrapper[4825]: I1007 20:01:03.501998 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331121-n45sq" 
event={"ID":"48f51d5f-d0ae-4123-969f-8bda81cbfb85","Type":"ContainerDied","Data":"84ed277a7ff7393c1355320ddfddf6cff3124528e001c3c7071233a740b98538"} Oct 07 20:01:04 crc kubenswrapper[4825]: I1007 20:01:04.920326 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.062511 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-combined-ca-bundle\") pod \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.062971 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-fernet-keys\") pod \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.063059 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-config-data\") pod \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.063125 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgnjc\" (UniqueName: \"kubernetes.io/projected/48f51d5f-d0ae-4123-969f-8bda81cbfb85-kube-api-access-hgnjc\") pod \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\" (UID: \"48f51d5f-d0ae-4123-969f-8bda81cbfb85\") " Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.068890 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f51d5f-d0ae-4123-969f-8bda81cbfb85-kube-api-access-hgnjc" 
(OuterVolumeSpecName: "kube-api-access-hgnjc") pod "48f51d5f-d0ae-4123-969f-8bda81cbfb85" (UID: "48f51d5f-d0ae-4123-969f-8bda81cbfb85"). InnerVolumeSpecName "kube-api-access-hgnjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.081746 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "48f51d5f-d0ae-4123-969f-8bda81cbfb85" (UID: "48f51d5f-d0ae-4123-969f-8bda81cbfb85"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.089663 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48f51d5f-d0ae-4123-969f-8bda81cbfb85" (UID: "48f51d5f-d0ae-4123-969f-8bda81cbfb85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.116207 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-config-data" (OuterVolumeSpecName: "config-data") pod "48f51d5f-d0ae-4123-969f-8bda81cbfb85" (UID: "48f51d5f-d0ae-4123-969f-8bda81cbfb85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.164867 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.164907 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgnjc\" (UniqueName: \"kubernetes.io/projected/48f51d5f-d0ae-4123-969f-8bda81cbfb85-kube-api-access-hgnjc\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.164920 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.164931 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48f51d5f-d0ae-4123-969f-8bda81cbfb85-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.525470 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331121-n45sq" event={"ID":"48f51d5f-d0ae-4123-969f-8bda81cbfb85","Type":"ContainerDied","Data":"f3aa40729b5d606431e920e1a7c0dd3685428c25ff7299f0345b4803d686d8e7"} Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.525819 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3aa40729b5d606431e920e1a7c0dd3685428c25ff7299f0345b4803d686d8e7" Oct 07 20:01:05 crc kubenswrapper[4825]: I1007 20:01:05.525783 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29331121-n45sq" Oct 07 20:01:20 crc kubenswrapper[4825]: I1007 20:01:20.695190 4825 generic.go:334] "Generic (PLEG): container finished" podID="49a8430e-be93-43be-8996-21969f8a8fa6" containerID="e75e895c4653c331d41e03141d836ddd6a18780952b183acc372378bc9bf3ce1" exitCode=0 Oct 07 20:01:20 crc kubenswrapper[4825]: I1007 20:01:20.695302 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-hdp2n" event={"ID":"49a8430e-be93-43be-8996-21969f8a8fa6","Type":"ContainerDied","Data":"e75e895c4653c331d41e03141d836ddd6a18780952b183acc372378bc9bf3ce1"} Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.832684 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.877611 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-phcqg/crc-debug-hdp2n"] Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.886655 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-phcqg/crc-debug-hdp2n"] Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.976978 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49a8430e-be93-43be-8996-21969f8a8fa6-host\") pod \"49a8430e-be93-43be-8996-21969f8a8fa6\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.977079 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48hdb\" (UniqueName: \"kubernetes.io/projected/49a8430e-be93-43be-8996-21969f8a8fa6-kube-api-access-48hdb\") pod \"49a8430e-be93-43be-8996-21969f8a8fa6\" (UID: \"49a8430e-be93-43be-8996-21969f8a8fa6\") " Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.977067 4825 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/49a8430e-be93-43be-8996-21969f8a8fa6-host" (OuterVolumeSpecName: "host") pod "49a8430e-be93-43be-8996-21969f8a8fa6" (UID: "49a8430e-be93-43be-8996-21969f8a8fa6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.977736 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49a8430e-be93-43be-8996-21969f8a8fa6-host\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:21 crc kubenswrapper[4825]: I1007 20:01:21.986059 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a8430e-be93-43be-8996-21969f8a8fa6-kube-api-access-48hdb" (OuterVolumeSpecName: "kube-api-access-48hdb") pod "49a8430e-be93-43be-8996-21969f8a8fa6" (UID: "49a8430e-be93-43be-8996-21969f8a8fa6"). InnerVolumeSpecName "kube-api-access-48hdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:01:22 crc kubenswrapper[4825]: I1007 20:01:22.080290 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48hdb\" (UniqueName: \"kubernetes.io/projected/49a8430e-be93-43be-8996-21969f8a8fa6-kube-api-access-48hdb\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:22 crc kubenswrapper[4825]: I1007 20:01:22.726779 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88857c7682d78919cf3761a633a66e411b42f13e6d9dda6e87c8c2b710108835" Oct 07 20:01:22 crc kubenswrapper[4825]: I1007 20:01:22.726811 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-hdp2n" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.100139 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-phcqg/crc-debug-d4dkz"] Oct 07 20:01:23 crc kubenswrapper[4825]: E1007 20:01:23.100856 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f51d5f-d0ae-4123-969f-8bda81cbfb85" containerName="keystone-cron" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.100889 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f51d5f-d0ae-4123-969f-8bda81cbfb85" containerName="keystone-cron" Oct 07 20:01:23 crc kubenswrapper[4825]: E1007 20:01:23.100935 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a8430e-be93-43be-8996-21969f8a8fa6" containerName="container-00" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.100952 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a8430e-be93-43be-8996-21969f8a8fa6" containerName="container-00" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.101415 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a8430e-be93-43be-8996-21969f8a8fa6" containerName="container-00" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.101476 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f51d5f-d0ae-4123-969f-8bda81cbfb85" containerName="keystone-cron" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.102793 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.105976 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-phcqg"/"default-dockercfg-x5dfk" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.207214 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgk4\" (UniqueName: \"kubernetes.io/projected/3265535a-7a3d-4692-a181-053faff3a4fd-kube-api-access-psgk4\") pod \"crc-debug-d4dkz\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.207500 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3265535a-7a3d-4692-a181-053faff3a4fd-host\") pod \"crc-debug-d4dkz\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.309784 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3265535a-7a3d-4692-a181-053faff3a4fd-host\") pod \"crc-debug-d4dkz\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.310010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3265535a-7a3d-4692-a181-053faff3a4fd-host\") pod \"crc-debug-d4dkz\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.310044 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgk4\" (UniqueName: 
\"kubernetes.io/projected/3265535a-7a3d-4692-a181-053faff3a4fd-kube-api-access-psgk4\") pod \"crc-debug-d4dkz\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.347686 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgk4\" (UniqueName: \"kubernetes.io/projected/3265535a-7a3d-4692-a181-053faff3a4fd-kube-api-access-psgk4\") pod \"crc-debug-d4dkz\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.431305 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:23 crc kubenswrapper[4825]: W1007 20:01:23.477118 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3265535a_7a3d_4692_a181_053faff3a4fd.slice/crio-12fb90bc04292909ea4e478d532b070145604f98bc5ec828b8fb362b6d8b01d6 WatchSource:0}: Error finding container 12fb90bc04292909ea4e478d532b070145604f98bc5ec828b8fb362b6d8b01d6: Status 404 returned error can't find the container with id 12fb90bc04292909ea4e478d532b070145604f98bc5ec828b8fb362b6d8b01d6 Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.749405 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" event={"ID":"3265535a-7a3d-4692-a181-053faff3a4fd","Type":"ContainerStarted","Data":"d6a45b03eeb4a94960d14b07f6be694d11743377ed547faee8f74683cc3c0def"} Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.749487 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" event={"ID":"3265535a-7a3d-4692-a181-053faff3a4fd","Type":"ContainerStarted","Data":"12fb90bc04292909ea4e478d532b070145604f98bc5ec828b8fb362b6d8b01d6"} Oct 07 20:01:23 crc 
kubenswrapper[4825]: I1007 20:01:23.776166 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" podStartSLOduration=0.776145113 podStartE2EDuration="776.145113ms" podCreationTimestamp="2025-10-07 20:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 20:01:23.764471536 +0000 UTC m=+3672.586510193" watchObservedRunningTime="2025-10-07 20:01:23.776145113 +0000 UTC m=+3672.598183760" Oct 07 20:01:23 crc kubenswrapper[4825]: I1007 20:01:23.808607 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a8430e-be93-43be-8996-21969f8a8fa6" path="/var/lib/kubelet/pods/49a8430e-be93-43be-8996-21969f8a8fa6/volumes" Oct 07 20:01:24 crc kubenswrapper[4825]: I1007 20:01:24.757590 4825 generic.go:334] "Generic (PLEG): container finished" podID="3265535a-7a3d-4692-a181-053faff3a4fd" containerID="d6a45b03eeb4a94960d14b07f6be694d11743377ed547faee8f74683cc3c0def" exitCode=0 Oct 07 20:01:24 crc kubenswrapper[4825]: I1007 20:01:24.757964 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" event={"ID":"3265535a-7a3d-4692-a181-053faff3a4fd","Type":"ContainerDied","Data":"d6a45b03eeb4a94960d14b07f6be694d11743377ed547faee8f74683cc3c0def"} Oct 07 20:01:25 crc kubenswrapper[4825]: I1007 20:01:25.857251 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:25 crc kubenswrapper[4825]: I1007 20:01:25.972886 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psgk4\" (UniqueName: \"kubernetes.io/projected/3265535a-7a3d-4692-a181-053faff3a4fd-kube-api-access-psgk4\") pod \"3265535a-7a3d-4692-a181-053faff3a4fd\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " Oct 07 20:01:25 crc kubenswrapper[4825]: I1007 20:01:25.973130 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3265535a-7a3d-4692-a181-053faff3a4fd-host\") pod \"3265535a-7a3d-4692-a181-053faff3a4fd\" (UID: \"3265535a-7a3d-4692-a181-053faff3a4fd\") " Oct 07 20:01:25 crc kubenswrapper[4825]: I1007 20:01:25.973264 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3265535a-7a3d-4692-a181-053faff3a4fd-host" (OuterVolumeSpecName: "host") pod "3265535a-7a3d-4692-a181-053faff3a4fd" (UID: "3265535a-7a3d-4692-a181-053faff3a4fd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 20:01:25 crc kubenswrapper[4825]: I1007 20:01:25.973557 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3265535a-7a3d-4692-a181-053faff3a4fd-host\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:25 crc kubenswrapper[4825]: I1007 20:01:25.983918 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3265535a-7a3d-4692-a181-053faff3a4fd-kube-api-access-psgk4" (OuterVolumeSpecName: "kube-api-access-psgk4") pod "3265535a-7a3d-4692-a181-053faff3a4fd" (UID: "3265535a-7a3d-4692-a181-053faff3a4fd"). InnerVolumeSpecName "kube-api-access-psgk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:01:26 crc kubenswrapper[4825]: I1007 20:01:26.074864 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psgk4\" (UniqueName: \"kubernetes.io/projected/3265535a-7a3d-4692-a181-053faff3a4fd-kube-api-access-psgk4\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:26 crc kubenswrapper[4825]: I1007 20:01:26.775372 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" event={"ID":"3265535a-7a3d-4692-a181-053faff3a4fd","Type":"ContainerDied","Data":"12fb90bc04292909ea4e478d532b070145604f98bc5ec828b8fb362b6d8b01d6"} Oct 07 20:01:26 crc kubenswrapper[4825]: I1007 20:01:26.775692 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12fb90bc04292909ea4e478d532b070145604f98bc5ec828b8fb362b6d8b01d6" Oct 07 20:01:26 crc kubenswrapper[4825]: I1007 20:01:26.775421 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-d4dkz" Oct 07 20:01:30 crc kubenswrapper[4825]: I1007 20:01:30.517400 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-phcqg/crc-debug-d4dkz"] Oct 07 20:01:30 crc kubenswrapper[4825]: I1007 20:01:30.530829 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-phcqg/crc-debug-d4dkz"] Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.724653 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-phcqg/crc-debug-jgg6h"] Oct 07 20:01:31 crc kubenswrapper[4825]: E1007 20:01:31.726858 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3265535a-7a3d-4692-a181-053faff3a4fd" containerName="container-00" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.727079 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3265535a-7a3d-4692-a181-053faff3a4fd" containerName="container-00" Oct 07 20:01:31 crc 
kubenswrapper[4825]: I1007 20:01:31.727702 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3265535a-7a3d-4692-a181-053faff3a4fd" containerName="container-00" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.728987 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.735689 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-phcqg"/"default-dockercfg-x5dfk" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.813660 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3265535a-7a3d-4692-a181-053faff3a4fd" path="/var/lib/kubelet/pods/3265535a-7a3d-4692-a181-053faff3a4fd/volumes" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.874263 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph8cw\" (UniqueName: \"kubernetes.io/projected/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-kube-api-access-ph8cw\") pod \"crc-debug-jgg6h\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.874596 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-host\") pod \"crc-debug-jgg6h\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.977180 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-host\") pod \"crc-debug-jgg6h\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:31 crc 
kubenswrapper[4825]: I1007 20:01:31.977409 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph8cw\" (UniqueName: \"kubernetes.io/projected/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-kube-api-access-ph8cw\") pod \"crc-debug-jgg6h\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:31 crc kubenswrapper[4825]: I1007 20:01:31.979753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-host\") pod \"crc-debug-jgg6h\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:32 crc kubenswrapper[4825]: I1007 20:01:32.017211 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph8cw\" (UniqueName: \"kubernetes.io/projected/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-kube-api-access-ph8cw\") pod \"crc-debug-jgg6h\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:32 crc kubenswrapper[4825]: I1007 20:01:32.063146 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:32 crc kubenswrapper[4825]: I1007 20:01:32.850353 4825 generic.go:334] "Generic (PLEG): container finished" podID="446fe837-d75b-4b37-9d3a-1d6a3a7ce319" containerID="24bfd5512590dace582148acf1e5270b39b91fecaa861251d0ee498eb7466b02" exitCode=0 Oct 07 20:01:32 crc kubenswrapper[4825]: I1007 20:01:32.850757 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-jgg6h" event={"ID":"446fe837-d75b-4b37-9d3a-1d6a3a7ce319","Type":"ContainerDied","Data":"24bfd5512590dace582148acf1e5270b39b91fecaa861251d0ee498eb7466b02"} Oct 07 20:01:32 crc kubenswrapper[4825]: I1007 20:01:32.850800 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/crc-debug-jgg6h" event={"ID":"446fe837-d75b-4b37-9d3a-1d6a3a7ce319","Type":"ContainerStarted","Data":"1fdf271bd2809f7687ee8c52ecfb7b0255c51984a51a807bf2192c15e0167577"} Oct 07 20:01:32 crc kubenswrapper[4825]: I1007 20:01:32.911172 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-phcqg/crc-debug-jgg6h"] Oct 07 20:01:32 crc kubenswrapper[4825]: I1007 20:01:32.922269 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-phcqg/crc-debug-jgg6h"] Oct 07 20:01:33 crc kubenswrapper[4825]: I1007 20:01:33.961870 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.029788 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph8cw\" (UniqueName: \"kubernetes.io/projected/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-kube-api-access-ph8cw\") pod \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.029832 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-host\") pod \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\" (UID: \"446fe837-d75b-4b37-9d3a-1d6a3a7ce319\") " Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.029947 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-host" (OuterVolumeSpecName: "host") pod "446fe837-d75b-4b37-9d3a-1d6a3a7ce319" (UID: "446fe837-d75b-4b37-9d3a-1d6a3a7ce319"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.030219 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-host\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.035368 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-kube-api-access-ph8cw" (OuterVolumeSpecName: "kube-api-access-ph8cw") pod "446fe837-d75b-4b37-9d3a-1d6a3a7ce319" (UID: "446fe837-d75b-4b37-9d3a-1d6a3a7ce319"). InnerVolumeSpecName "kube-api-access-ph8cw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.130798 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph8cw\" (UniqueName: \"kubernetes.io/projected/446fe837-d75b-4b37-9d3a-1d6a3a7ce319-kube-api-access-ph8cw\") on node \"crc\" DevicePath \"\"" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.689477 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/util/0.log" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.807244 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/util/0.log" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.812569 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/pull/0.log" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.860425 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/pull/0.log" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.873936 4825 scope.go:117] "RemoveContainer" containerID="24bfd5512590dace582148acf1e5270b39b91fecaa861251d0ee498eb7466b02" Oct 07 20:01:34 crc kubenswrapper[4825]: I1007 20:01:34.874050 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/crc-debug-jgg6h" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.073057 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/pull/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.100916 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/extract/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.103639 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/util/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.205120 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-mrnwx_97dc66cd-4313-4951-b85c-dedd5cd2e6ba/kube-rbac-proxy/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.288444 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v4dk9_ef71a3c8-e986-4f19-a234-9e9ef7749132/kube-rbac-proxy/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.316170 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-mrnwx_97dc66cd-4313-4951-b85c-dedd5cd2e6ba/manager/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.390372 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v4dk9_ef71a3c8-e986-4f19-a234-9e9ef7749132/manager/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.448308 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-ln2cd_c310d873-fc90-4658-ab06-ffa16a97c784/kube-rbac-proxy/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.501555 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-ln2cd_c310d873-fc90-4658-ab06-ffa16a97c784/manager/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.607463 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-n28d6_a0f7df98-caae-40a5-bb89-94123bce0763/kube-rbac-proxy/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.710744 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-n28d6_a0f7df98-caae-40a5-bb89-94123bce0763/manager/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.763881 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jrh49_62bd3185-8c68-419d-b523-2de43d8dd015/manager/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.780263 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jrh49_62bd3185-8c68-419d-b523-2de43d8dd015/kube-rbac-proxy/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.804944 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446fe837-d75b-4b37-9d3a-1d6a3a7ce319" path="/var/lib/kubelet/pods/446fe837-d75b-4b37-9d3a-1d6a3a7ce319/volumes" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.893753 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-b9tp8_282406b3-2501-4b01-adf1-d952fc240404/kube-rbac-proxy/0.log" Oct 07 20:01:35 crc kubenswrapper[4825]: I1007 20:01:35.922980 
4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-b9tp8_282406b3-2501-4b01-adf1-d952fc240404/manager/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.043688 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-f9pdx_1b11f862-ee30-4996-a8fb-218b3c27f07a/kube-rbac-proxy/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.186096 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-m87x4_3f63b792-0ed9-453e-8dff-afac52bac339/kube-rbac-proxy/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.220311 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-f9pdx_1b11f862-ee30-4996-a8fb-218b3c27f07a/manager/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.268712 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-m87x4_3f63b792-0ed9-453e-8dff-afac52bac339/manager/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.340007 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-5mng4_528dd884-a7df-4574-920f-86ae0d779b62/kube-rbac-proxy/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.425592 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-5mng4_528dd884-a7df-4574-920f-86ae0d779b62/manager/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.515096 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-9nndm_9e61b6db-a40e-4ce3-8086-e51bbc6f6295/kube-rbac-proxy/0.log" Oct 07 20:01:36 crc 
kubenswrapper[4825]: I1007 20:01:36.631453 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm_22f18be3-b165-4b14-90bd-3eac19ae3fee/kube-rbac-proxy/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.632213 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-9nndm_9e61b6db-a40e-4ce3-8086-e51bbc6f6295/manager/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.692684 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm_22f18be3-b165-4b14-90bd-3eac19ae3fee/manager/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.811931 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-fpkzs_01529960-5bd1-4a4d-8703-8d6a3ff38d4b/kube-rbac-proxy/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.847697 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-fpkzs_01529960-5bd1-4a4d-8703-8d6a3ff38d4b/manager/0.log" Oct 07 20:01:36 crc kubenswrapper[4825]: I1007 20:01:36.963486 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vn86z_0caa8db7-d83d-47bd-9276-29102dd20de8/kube-rbac-proxy/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.081106 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vn86z_0caa8db7-d83d-47bd-9276-29102dd20de8/manager/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.091280 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-hjgxc_0ecb1a32-2936-470c-a9c5-6701d461cd71/kube-rbac-proxy/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.156099 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-hjgxc_0ecb1a32-2936-470c-a9c5-6701d461cd71/manager/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.256284 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv_8cacb372-6381-4182-92eb-81e607f7cf31/kube-rbac-proxy/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.259316 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv_8cacb372-6381-4182-92eb-81e607f7cf31/manager/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.418925 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-rxjhm_3463b9a9-3935-4a41-b710-77084296fa18/kube-rbac-proxy/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.546572 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-w4tpc_0b0eb630-7794-4425-9ada-29b15acb6bdb/kube-rbac-proxy/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.763386 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-l89d7_fd7a1b83-b50f-41c1-8092-ce7135ffe155/registry-server/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: I1007 20:01:37.801801 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-w4tpc_0b0eb630-7794-4425-9ada-29b15acb6bdb/operator/0.log" Oct 07 20:01:37 crc kubenswrapper[4825]: 
I1007 20:01:37.991339 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-9llfs_844cfe74-a770-4268-a60a-372586ac0744/kube-rbac-proxy/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.069027 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-9llfs_844cfe74-a770-4268-a60a-372586ac0744/manager/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.171691 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lp8qp_3b8778b6-81a2-4e3c-b464-6e5c8e063a4b/kube-rbac-proxy/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.238981 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lp8qp_3b8778b6-81a2-4e3c-b464-6e5c8e063a4b/manager/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.309878 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rgt49_063bceb1-c26d-453a-a74a-e6874c273034/operator/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.458970 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-rxjhm_3463b9a9-3935-4a41-b710-77084296fa18/manager/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.496140 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bs76l_c720cafe-11e6-4959-8228-b03cdb65242d/manager/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.520036 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bs76l_c720cafe-11e6-4959-8228-b03cdb65242d/kube-rbac-proxy/0.log" Oct 07 
20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.596205 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-jvj5c_7f5bc608-3853-4a58-ac8d-18f57baffe4c/kube-rbac-proxy/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.674323 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-jvj5c_7f5bc608-3853-4a58-ac8d-18f57baffe4c/manager/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.694333 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lg8z7_bd6d051a-119d-45c5-9b81-939bba328c56/kube-rbac-proxy/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.780345 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lg8z7_bd6d051a-119d-45c5-9b81-939bba328c56/manager/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.825955 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-zbnmp_f99d1a15-090e-4a5e-a210-690be64c4742/kube-rbac-proxy/0.log" Oct 07 20:01:38 crc kubenswrapper[4825]: I1007 20:01:38.897831 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-zbnmp_f99d1a15-090e-4a5e-a210-690be64c4742/manager/0.log" Oct 07 20:01:55 crc kubenswrapper[4825]: I1007 20:01:55.345473 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mhcpj_962646e1-6f06-40ed-a19a-d73f55b93d95/control-plane-machine-set-operator/0.log" Oct 07 20:01:55 crc kubenswrapper[4825]: I1007 20:01:55.535592 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hvsq_b3cea192-f8e9-426c-887e-68a8d8f2dad5/machine-api-operator/0.log" Oct 07 20:01:55 crc kubenswrapper[4825]: I1007 20:01:55.558303 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hvsq_b3cea192-f8e9-426c-887e-68a8d8f2dad5/kube-rbac-proxy/0.log" Oct 07 20:02:09 crc kubenswrapper[4825]: I1007 20:02:09.232340 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vhchg_1ff41e8d-639e-4710-a863-1c6dbec99768/cert-manager-controller/0.log" Oct 07 20:02:09 crc kubenswrapper[4825]: I1007 20:02:09.245505 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-ff56x_491e6da2-5d0d-4a47-abda-467d60d5ec14/cert-manager-cainjector/0.log" Oct 07 20:02:09 crc kubenswrapper[4825]: I1007 20:02:09.338511 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ph62w_a95caa53-91d1-4d61-872a-c0ff3539d4d7/cert-manager-webhook/0.log" Oct 07 20:02:22 crc kubenswrapper[4825]: I1007 20:02:22.930138 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-dzb8f_bf7784c9-07ce-45f5-ad97-788cf3ef3b36/nmstate-console-plugin/0.log" Oct 07 20:02:23 crc kubenswrapper[4825]: I1007 20:02:23.098549 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-v8dxh_2e774a23-bfdc-466c-92ed-4a9eb8f74c44/nmstate-handler/0.log" Oct 07 20:02:23 crc kubenswrapper[4825]: I1007 20:02:23.109284 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vv8pn_fef914bd-768f-4cd2-92c1-b5fb49e63ca8/kube-rbac-proxy/0.log" Oct 07 20:02:23 crc kubenswrapper[4825]: I1007 20:02:23.134331 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vv8pn_fef914bd-768f-4cd2-92c1-b5fb49e63ca8/nmstate-metrics/0.log" Oct 07 20:02:23 crc kubenswrapper[4825]: I1007 20:02:23.287583 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-8lh75_00238ddf-6ee8-44a7-97a3-7d1563e1a1d7/nmstate-operator/0.log" Oct 07 20:02:23 crc kubenswrapper[4825]: I1007 20:02:23.294931 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-4kjx7_a194e8ec-fa8a-4efb-af70-ea121bb7d835/nmstate-webhook/0.log" Oct 07 20:02:35 crc kubenswrapper[4825]: I1007 20:02:35.709584 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 20:02:35 crc kubenswrapper[4825]: I1007 20:02:35.710010 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 20:02:37 crc kubenswrapper[4825]: I1007 20:02:37.833252 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6vp5x_5c658b5b-d9ac-4877-894a-770c7fefcf5e/kube-rbac-proxy/0.log" Oct 07 20:02:37 crc kubenswrapper[4825]: I1007 20:02:37.861456 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6vp5x_5c658b5b-d9ac-4877-894a-770c7fefcf5e/controller/0.log" Oct 07 20:02:37 crc kubenswrapper[4825]: I1007 20:02:37.926910 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.076244 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.094662 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.111952 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.160895 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.329715 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.353462 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.366430 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.401280 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.541757 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.578795 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.590400 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/controller/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.598171 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.748688 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/frr-metrics/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.816640 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/kube-rbac-proxy-frr/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.852635 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/kube-rbac-proxy/0.log" Oct 07 20:02:38 crc kubenswrapper[4825]: I1007 20:02:38.962799 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/reloader/0.log" Oct 07 20:02:39 crc kubenswrapper[4825]: I1007 20:02:39.055485 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-sd7s7_15d8205a-b357-40f1-813d-e42c9d6ac2f0/frr-k8s-webhook-server/0.log" Oct 07 20:02:39 crc kubenswrapper[4825]: I1007 20:02:39.210887 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bb67dff7d-fcd7m_90e787f1-6fb5-4827-b024-89aeb27ca750/manager/0.log" Oct 07 20:02:39 crc kubenswrapper[4825]: I1007 20:02:39.363518 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c9df698c8-5bgs4_51adc395-c4fb-43b7-a152-871a4b65a832/webhook-server/0.log" Oct 07 20:02:39 crc kubenswrapper[4825]: I1007 20:02:39.467985 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x5hwq_e705456a-fdcd-4d7e-b3e9-0146cf587db8/kube-rbac-proxy/0.log" Oct 07 20:02:39 crc kubenswrapper[4825]: I1007 20:02:39.999315 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x5hwq_e705456a-fdcd-4d7e-b3e9-0146cf587db8/speaker/0.log" Oct 07 20:02:40 crc kubenswrapper[4825]: I1007 20:02:40.008352 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/frr/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.197889 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/util/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.381492 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/util/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.404872 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/pull/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.425563 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/pull/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.572829 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/pull/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.576029 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/util/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.614160 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/extract/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.788715 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-utilities/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.973571 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-utilities/0.log" Oct 07 20:02:54 crc kubenswrapper[4825]: I1007 20:02:54.978194 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-content/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.029355 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-content/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.134129 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-content/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.139434 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-utilities/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.316677 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-utilities/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.556040 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-utilities/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.556075 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-content/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.591837 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-content/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.683442 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/registry-server/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.788535 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-utilities/0.log" Oct 07 20:02:55 crc kubenswrapper[4825]: I1007 20:02:55.826245 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-content/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.016644 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/util/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.116085 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/registry-server/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.227181 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/pull/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.250020 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/pull/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.264052 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/util/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.400362 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/pull/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.431890 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/util/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.475913 
4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/extract/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.567335 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kgrmp_69920aad-eedb-4eca-887a-8f3225bff52b/marketplace-operator/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.626198 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-utilities/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.835067 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-content/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.838623 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-utilities/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.846324 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-content/0.log" Oct 07 20:02:56 crc kubenswrapper[4825]: I1007 20:02:56.995675 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-utilities/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.034364 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-content/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.126863 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/registry-server/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.193859 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-utilities/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.364965 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-content/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.368809 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-utilities/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.394038 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-content/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.510724 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-utilities/0.log" Oct 07 20:02:57 crc kubenswrapper[4825]: I1007 20:02:57.551377 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-content/0.log" Oct 07 20:02:58 crc kubenswrapper[4825]: I1007 20:02:58.021529 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/registry-server/0.log" Oct 07 20:03:05 crc kubenswrapper[4825]: I1007 20:03:05.708980 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 20:03:05 crc kubenswrapper[4825]: I1007 20:03:05.709723 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 20:03:35 crc kubenswrapper[4825]: I1007 20:03:35.709354 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 20:03:35 crc kubenswrapper[4825]: I1007 20:03:35.709946 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 20:03:35 crc kubenswrapper[4825]: I1007 20:03:35.710021 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" Oct 07 20:03:35 crc kubenswrapper[4825]: I1007 20:03:35.711053 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 20:03:35 crc kubenswrapper[4825]: I1007 20:03:35.711147 4825 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" gracePeriod=600 Oct 07 20:03:36 crc kubenswrapper[4825]: E1007 20:03:36.279347 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:03:37 crc kubenswrapper[4825]: I1007 20:03:37.099619 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" exitCode=0 Oct 07 20:03:37 crc kubenswrapper[4825]: I1007 20:03:37.099691 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962"} Oct 07 20:03:37 crc kubenswrapper[4825]: I1007 20:03:37.099740 4825 scope.go:117] "RemoveContainer" containerID="746ab6b7314c4e4728ba5c166df7d076205c8ec7f2dab0742da8f3345d6dc112" Oct 07 20:03:37 crc kubenswrapper[4825]: I1007 20:03:37.100637 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:03:37 crc kubenswrapper[4825]: E1007 20:03:37.101197 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:03:52 crc kubenswrapper[4825]: I1007 20:03:52.795467 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:03:52 crc kubenswrapper[4825]: E1007 20:03:52.796599 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.666742 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5sv2v"] Oct 07 20:04:06 crc kubenswrapper[4825]: E1007 20:04:06.667768 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446fe837-d75b-4b37-9d3a-1d6a3a7ce319" containerName="container-00" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.667784 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="446fe837-d75b-4b37-9d3a-1d6a3a7ce319" containerName="container-00" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.668049 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="446fe837-d75b-4b37-9d3a-1d6a3a7ce319" containerName="container-00" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.669891 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.680719 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5sv2v"] Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.821543 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-utilities\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.821848 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkqk\" (UniqueName: \"kubernetes.io/projected/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-kube-api-access-hwkqk\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.821961 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-catalog-content\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.923512 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-utilities\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.923584 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hwkqk\" (UniqueName: \"kubernetes.io/projected/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-kube-api-access-hwkqk\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.923620 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-catalog-content\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.924092 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-catalog-content\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.924149 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-utilities\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:06 crc kubenswrapper[4825]: I1007 20:04:06.942532 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkqk\" (UniqueName: \"kubernetes.io/projected/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-kube-api-access-hwkqk\") pod \"redhat-operators-5sv2v\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:07 crc kubenswrapper[4825]: I1007 20:04:07.012505 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:07 crc kubenswrapper[4825]: I1007 20:04:07.473012 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5sv2v"] Oct 07 20:04:07 crc kubenswrapper[4825]: I1007 20:04:07.796075 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:04:07 crc kubenswrapper[4825]: E1007 20:04:07.796448 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:04:08 crc kubenswrapper[4825]: I1007 20:04:08.465374 4825 generic.go:334] "Generic (PLEG): container finished" podID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerID="dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e" exitCode=0 Oct 07 20:04:08 crc kubenswrapper[4825]: I1007 20:04:08.465429 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sv2v" event={"ID":"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b","Type":"ContainerDied","Data":"dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e"} Oct 07 20:04:08 crc kubenswrapper[4825]: I1007 20:04:08.465700 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sv2v" event={"ID":"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b","Type":"ContainerStarted","Data":"69315384ab81c78d9e71c9610720bc98e4ec624fa276bc6b70e21c529d60fdd8"} Oct 07 20:04:08 crc kubenswrapper[4825]: I1007 20:04:08.468724 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 20:04:10 crc 
kubenswrapper[4825]: I1007 20:04:10.493612 4825 generic.go:334] "Generic (PLEG): container finished" podID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerID="89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb" exitCode=0 Oct 07 20:04:10 crc kubenswrapper[4825]: I1007 20:04:10.493706 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sv2v" event={"ID":"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b","Type":"ContainerDied","Data":"89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb"} Oct 07 20:04:11 crc kubenswrapper[4825]: I1007 20:04:11.525175 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sv2v" event={"ID":"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b","Type":"ContainerStarted","Data":"1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0"} Oct 07 20:04:11 crc kubenswrapper[4825]: I1007 20:04:11.557154 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5sv2v" podStartSLOduration=3.044975904 podStartE2EDuration="5.557131836s" podCreationTimestamp="2025-10-07 20:04:06 +0000 UTC" firstStartedPulling="2025-10-07 20:04:08.4684871 +0000 UTC m=+3837.290525737" lastFinishedPulling="2025-10-07 20:04:10.980643032 +0000 UTC m=+3839.802681669" observedRunningTime="2025-10-07 20:04:11.547505501 +0000 UTC m=+3840.369544148" watchObservedRunningTime="2025-10-07 20:04:11.557131836 +0000 UTC m=+3840.379170483" Oct 07 20:04:17 crc kubenswrapper[4825]: I1007 20:04:17.013487 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:17 crc kubenswrapper[4825]: I1007 20:04:17.013919 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:17 crc kubenswrapper[4825]: I1007 20:04:17.100154 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:17 crc kubenswrapper[4825]: I1007 20:04:17.657681 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:17 crc kubenswrapper[4825]: I1007 20:04:17.717953 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5sv2v"] Oct 07 20:04:19 crc kubenswrapper[4825]: I1007 20:04:19.623820 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5sv2v" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="registry-server" containerID="cri-o://1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0" gracePeriod=2 Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.172455 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.296931 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-catalog-content\") pod \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.297124 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkqk\" (UniqueName: \"kubernetes.io/projected/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-kube-api-access-hwkqk\") pod \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.297221 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-utilities\") pod 
\"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\" (UID: \"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b\") " Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.299554 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-utilities" (OuterVolumeSpecName: "utilities") pod "0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" (UID: "0c3d2dd8-ec63-42c6-be6a-a301e8049d9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.319701 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-kube-api-access-hwkqk" (OuterVolumeSpecName: "kube-api-access-hwkqk") pod "0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" (UID: "0c3d2dd8-ec63-42c6-be6a-a301e8049d9b"). InnerVolumeSpecName "kube-api-access-hwkqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.399600 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.399642 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwkqk\" (UniqueName: \"kubernetes.io/projected/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-kube-api-access-hwkqk\") on node \"crc\" DevicePath \"\"" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.636159 4825 generic.go:334] "Generic (PLEG): container finished" podID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerID="1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0" exitCode=0 Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.636220 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sv2v" 
event={"ID":"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b","Type":"ContainerDied","Data":"1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0"} Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.636255 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5sv2v" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.636271 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5sv2v" event={"ID":"0c3d2dd8-ec63-42c6-be6a-a301e8049d9b","Type":"ContainerDied","Data":"69315384ab81c78d9e71c9610720bc98e4ec624fa276bc6b70e21c529d60fdd8"} Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.636288 4825 scope.go:117] "RemoveContainer" containerID="1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.673264 4825 scope.go:117] "RemoveContainer" containerID="89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.696189 4825 scope.go:117] "RemoveContainer" containerID="dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.760695 4825 scope.go:117] "RemoveContainer" containerID="1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0" Oct 07 20:04:20 crc kubenswrapper[4825]: E1007 20:04:20.761333 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0\": container with ID starting with 1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0 not found: ID does not exist" containerID="1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.761383 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0"} err="failed to get container status \"1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0\": rpc error: code = NotFound desc = could not find container \"1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0\": container with ID starting with 1546a104ec05463131c2de2b1a06e14ad3617d97d62ac84193aa1855c4e833c0 not found: ID does not exist" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.761417 4825 scope.go:117] "RemoveContainer" containerID="89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb" Oct 07 20:04:20 crc kubenswrapper[4825]: E1007 20:04:20.761890 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb\": container with ID starting with 89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb not found: ID does not exist" containerID="89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.761975 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb"} err="failed to get container status \"89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb\": rpc error: code = NotFound desc = could not find container \"89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb\": container with ID starting with 89f2ee6f99b2f20da165a284e7cfaa9da9caff81a618c76677a181a0fe2d11cb not found: ID does not exist" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.762029 4825 scope.go:117] "RemoveContainer" containerID="dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e" Oct 07 20:04:20 crc kubenswrapper[4825]: E1007 20:04:20.762834 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e\": container with ID starting with dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e not found: ID does not exist" containerID="dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e" Oct 07 20:04:20 crc kubenswrapper[4825]: I1007 20:04:20.762873 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e"} err="failed to get container status \"dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e\": rpc error: code = NotFound desc = could not find container \"dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e\": container with ID starting with dcdfa662975a333a69addb43617e57833fe7ee5587570eec5ffc0f52f0266d0e not found: ID does not exist" Oct 07 20:04:21 crc kubenswrapper[4825]: I1007 20:04:21.286101 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" (UID: "0c3d2dd8-ec63-42c6-be6a-a301e8049d9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:04:21 crc kubenswrapper[4825]: I1007 20:04:21.319735 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 20:04:21 crc kubenswrapper[4825]: I1007 20:04:21.586829 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5sv2v"] Oct 07 20:04:21 crc kubenswrapper[4825]: I1007 20:04:21.597769 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5sv2v"] Oct 07 20:04:21 crc kubenswrapper[4825]: I1007 20:04:21.802529 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:04:21 crc kubenswrapper[4825]: E1007 20:04:21.802794 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:04:21 crc kubenswrapper[4825]: I1007 20:04:21.807868 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" path="/var/lib/kubelet/pods/0c3d2dd8-ec63-42c6-be6a-a301e8049d9b/volumes" Oct 07 20:04:36 crc kubenswrapper[4825]: I1007 20:04:36.795909 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:04:36 crc kubenswrapper[4825]: E1007 20:04:36.796740 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:04:49 crc kubenswrapper[4825]: I1007 20:04:49.796158 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:04:49 crc kubenswrapper[4825]: E1007 20:04:49.797145 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:04:51 crc kubenswrapper[4825]: I1007 20:04:51.011320 4825 generic.go:334] "Generic (PLEG): container finished" podID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerID="3d34aa3e799fe2e67894188bcdb796621c86ac7a2f62faacbf5339bde7cd6974" exitCode=0 Oct 07 20:04:51 crc kubenswrapper[4825]: I1007 20:04:51.011422 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-phcqg/must-gather-r7lmv" event={"ID":"4eafd191-a4e0-46a6-807e-810e66ef4eec","Type":"ContainerDied","Data":"3d34aa3e799fe2e67894188bcdb796621c86ac7a2f62faacbf5339bde7cd6974"} Oct 07 20:04:51 crc kubenswrapper[4825]: I1007 20:04:51.012547 4825 scope.go:117] "RemoveContainer" containerID="3d34aa3e799fe2e67894188bcdb796621c86ac7a2f62faacbf5339bde7cd6974" Oct 07 20:04:51 crc kubenswrapper[4825]: I1007 20:04:51.225708 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-phcqg_must-gather-r7lmv_4eafd191-a4e0-46a6-807e-810e66ef4eec/gather/0.log" Oct 07 20:04:58 crc kubenswrapper[4825]: I1007 20:04:58.823629 4825 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-phcqg/must-gather-r7lmv"] Oct 07 20:04:58 crc kubenswrapper[4825]: I1007 20:04:58.827083 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-phcqg/must-gather-r7lmv" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerName="copy" containerID="cri-o://a4f7e12674486d6146e146468603cb13388cacb44168543673a82b71fdf820bd" gracePeriod=2 Oct 07 20:04:58 crc kubenswrapper[4825]: I1007 20:04:58.858996 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-phcqg/must-gather-r7lmv"] Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.106086 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-phcqg_must-gather-r7lmv_4eafd191-a4e0-46a6-807e-810e66ef4eec/copy/0.log" Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.106762 4825 generic.go:334] "Generic (PLEG): container finished" podID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerID="a4f7e12674486d6146e146468603cb13388cacb44168543673a82b71fdf820bd" exitCode=143 Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.335402 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-phcqg_must-gather-r7lmv_4eafd191-a4e0-46a6-807e-810e66ef4eec/copy/0.log" Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.335816 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.509147 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjq79\" (UniqueName: \"kubernetes.io/projected/4eafd191-a4e0-46a6-807e-810e66ef4eec-kube-api-access-zjq79\") pod \"4eafd191-a4e0-46a6-807e-810e66ef4eec\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.509513 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eafd191-a4e0-46a6-807e-810e66ef4eec-must-gather-output\") pod \"4eafd191-a4e0-46a6-807e-810e66ef4eec\" (UID: \"4eafd191-a4e0-46a6-807e-810e66ef4eec\") " Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.517058 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eafd191-a4e0-46a6-807e-810e66ef4eec-kube-api-access-zjq79" (OuterVolumeSpecName: "kube-api-access-zjq79") pod "4eafd191-a4e0-46a6-807e-810e66ef4eec" (UID: "4eafd191-a4e0-46a6-807e-810e66ef4eec"). InnerVolumeSpecName "kube-api-access-zjq79". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.612055 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjq79\" (UniqueName: \"kubernetes.io/projected/4eafd191-a4e0-46a6-807e-810e66ef4eec-kube-api-access-zjq79\") on node \"crc\" DevicePath \"\"" Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.691424 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eafd191-a4e0-46a6-807e-810e66ef4eec-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4eafd191-a4e0-46a6-807e-810e66ef4eec" (UID: "4eafd191-a4e0-46a6-807e-810e66ef4eec"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.714466 4825 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eafd191-a4e0-46a6-807e-810e66ef4eec-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 20:04:59 crc kubenswrapper[4825]: I1007 20:04:59.806874 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" path="/var/lib/kubelet/pods/4eafd191-a4e0-46a6-807e-810e66ef4eec/volumes" Oct 07 20:05:00 crc kubenswrapper[4825]: I1007 20:05:00.116770 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-phcqg_must-gather-r7lmv_4eafd191-a4e0-46a6-807e-810e66ef4eec/copy/0.log" Oct 07 20:05:00 crc kubenswrapper[4825]: I1007 20:05:00.118439 4825 scope.go:117] "RemoveContainer" containerID="a4f7e12674486d6146e146468603cb13388cacb44168543673a82b71fdf820bd" Oct 07 20:05:00 crc kubenswrapper[4825]: I1007 20:05:00.118501 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-phcqg/must-gather-r7lmv" Oct 07 20:05:00 crc kubenswrapper[4825]: I1007 20:05:00.147586 4825 scope.go:117] "RemoveContainer" containerID="3d34aa3e799fe2e67894188bcdb796621c86ac7a2f62faacbf5339bde7cd6974" Oct 07 20:05:02 crc kubenswrapper[4825]: I1007 20:05:02.795179 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:05:02 crc kubenswrapper[4825]: E1007 20:05:02.795923 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:05:17 crc kubenswrapper[4825]: I1007 20:05:17.796333 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:05:17 crc kubenswrapper[4825]: E1007 20:05:17.797222 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.731324 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2s94g"] Oct 07 20:05:28 crc kubenswrapper[4825]: E1007 20:05:28.733162 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerName="gather" Oct 07 20:05:28 crc 
kubenswrapper[4825]: I1007 20:05:28.733192 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerName="gather" Oct 07 20:05:28 crc kubenswrapper[4825]: E1007 20:05:28.733221 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerName="copy" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.733257 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerName="copy" Oct 07 20:05:28 crc kubenswrapper[4825]: E1007 20:05:28.733291 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="extract-content" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.733300 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="extract-content" Oct 07 20:05:28 crc kubenswrapper[4825]: E1007 20:05:28.733320 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="registry-server" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.733328 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="registry-server" Oct 07 20:05:28 crc kubenswrapper[4825]: E1007 20:05:28.733347 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="extract-utilities" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.733358 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="extract-utilities" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.733656 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerName="copy" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.733678 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3d2dd8-ec63-42c6-be6a-a301e8049d9b" containerName="registry-server" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.733702 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eafd191-a4e0-46a6-807e-810e66ef4eec" containerName="gather" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.737075 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.755861 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2s94g"] Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.892197 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-catalog-content\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.892493 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvklv\" (UniqueName: \"kubernetes.io/projected/cbd506ae-baef-4cae-81c3-04882bf64064-kube-api-access-nvklv\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.892775 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-utilities\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 
20:05:28.994168 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-catalog-content\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.994359 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvklv\" (UniqueName: \"kubernetes.io/projected/cbd506ae-baef-4cae-81c3-04882bf64064-kube-api-access-nvklv\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.994426 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-utilities\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.995278 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-catalog-content\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:28 crc kubenswrapper[4825]: I1007 20:05:28.995777 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-utilities\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:29 crc kubenswrapper[4825]: I1007 20:05:29.022198 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvklv\" (UniqueName: \"kubernetes.io/projected/cbd506ae-baef-4cae-81c3-04882bf64064-kube-api-access-nvklv\") pod \"certified-operators-2s94g\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:29 crc kubenswrapper[4825]: I1007 20:05:29.082793 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:29 crc kubenswrapper[4825]: I1007 20:05:29.621681 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2s94g"] Oct 07 20:05:30 crc kubenswrapper[4825]: I1007 20:05:30.549353 4825 generic.go:334] "Generic (PLEG): container finished" podID="cbd506ae-baef-4cae-81c3-04882bf64064" containerID="0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7" exitCode=0 Oct 07 20:05:30 crc kubenswrapper[4825]: I1007 20:05:30.549451 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2s94g" event={"ID":"cbd506ae-baef-4cae-81c3-04882bf64064","Type":"ContainerDied","Data":"0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7"} Oct 07 20:05:30 crc kubenswrapper[4825]: I1007 20:05:30.549818 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2s94g" event={"ID":"cbd506ae-baef-4cae-81c3-04882bf64064","Type":"ContainerStarted","Data":"d7d35bdf23ce2e92aad9b349aea663f6778acdb02f5c527dbc85f4a11cb3ee68"} Oct 07 20:05:32 crc kubenswrapper[4825]: I1007 20:05:32.579897 4825 generic.go:334] "Generic (PLEG): container finished" podID="cbd506ae-baef-4cae-81c3-04882bf64064" containerID="e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11" exitCode=0 Oct 07 20:05:32 crc kubenswrapper[4825]: I1007 20:05:32.580047 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2s94g" event={"ID":"cbd506ae-baef-4cae-81c3-04882bf64064","Type":"ContainerDied","Data":"e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11"} Oct 07 20:05:32 crc kubenswrapper[4825]: I1007 20:05:32.798085 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:05:32 crc kubenswrapper[4825]: E1007 20:05:32.798802 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:05:33 crc kubenswrapper[4825]: I1007 20:05:33.591892 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2s94g" event={"ID":"cbd506ae-baef-4cae-81c3-04882bf64064","Type":"ContainerStarted","Data":"006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2"} Oct 07 20:05:33 crc kubenswrapper[4825]: I1007 20:05:33.623398 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2s94g" podStartSLOduration=3.106603283 podStartE2EDuration="5.623372652s" podCreationTimestamp="2025-10-07 20:05:28 +0000 UTC" firstStartedPulling="2025-10-07 20:05:30.552439998 +0000 UTC m=+3919.374478675" lastFinishedPulling="2025-10-07 20:05:33.069209367 +0000 UTC m=+3921.891248044" observedRunningTime="2025-10-07 20:05:33.619183328 +0000 UTC m=+3922.441221965" watchObservedRunningTime="2025-10-07 20:05:33.623372652 +0000 UTC m=+3922.445411299" Oct 07 20:05:39 crc kubenswrapper[4825]: I1007 20:05:39.083639 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:39 crc kubenswrapper[4825]: I1007 20:05:39.084356 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:39 crc kubenswrapper[4825]: I1007 20:05:39.163401 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:39 crc kubenswrapper[4825]: I1007 20:05:39.751060 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:39 crc kubenswrapper[4825]: I1007 20:05:39.835195 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2s94g"] Oct 07 20:05:41 crc kubenswrapper[4825]: I1007 20:05:41.702627 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2s94g" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="registry-server" containerID="cri-o://006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2" gracePeriod=2 Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.242918 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.389029 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvklv\" (UniqueName: \"kubernetes.io/projected/cbd506ae-baef-4cae-81c3-04882bf64064-kube-api-access-nvklv\") pod \"cbd506ae-baef-4cae-81c3-04882bf64064\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.389108 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-catalog-content\") pod \"cbd506ae-baef-4cae-81c3-04882bf64064\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.389217 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-utilities\") pod \"cbd506ae-baef-4cae-81c3-04882bf64064\" (UID: \"cbd506ae-baef-4cae-81c3-04882bf64064\") " Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.391295 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-utilities" (OuterVolumeSpecName: "utilities") pod "cbd506ae-baef-4cae-81c3-04882bf64064" (UID: "cbd506ae-baef-4cae-81c3-04882bf64064"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.398472 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd506ae-baef-4cae-81c3-04882bf64064-kube-api-access-nvklv" (OuterVolumeSpecName: "kube-api-access-nvklv") pod "cbd506ae-baef-4cae-81c3-04882bf64064" (UID: "cbd506ae-baef-4cae-81c3-04882bf64064"). InnerVolumeSpecName "kube-api-access-nvklv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.452863 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbd506ae-baef-4cae-81c3-04882bf64064" (UID: "cbd506ae-baef-4cae-81c3-04882bf64064"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.491550 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.491598 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd506ae-baef-4cae-81c3-04882bf64064-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.491607 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvklv\" (UniqueName: \"kubernetes.io/projected/cbd506ae-baef-4cae-81c3-04882bf64064-kube-api-access-nvklv\") on node \"crc\" DevicePath \"\"" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.716544 4825 generic.go:334] "Generic (PLEG): container finished" podID="cbd506ae-baef-4cae-81c3-04882bf64064" containerID="006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2" exitCode=0 Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.716607 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2s94g" event={"ID":"cbd506ae-baef-4cae-81c3-04882bf64064","Type":"ContainerDied","Data":"006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2"} Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.716637 4825 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2s94g" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.716664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2s94g" event={"ID":"cbd506ae-baef-4cae-81c3-04882bf64064","Type":"ContainerDied","Data":"d7d35bdf23ce2e92aad9b349aea663f6778acdb02f5c527dbc85f4a11cb3ee68"} Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.716714 4825 scope.go:117] "RemoveContainer" containerID="006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.739766 4825 scope.go:117] "RemoveContainer" containerID="e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.772134 4825 scope.go:117] "RemoveContainer" containerID="0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.802635 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2s94g"] Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.818406 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2s94g"] Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.859517 4825 scope.go:117] "RemoveContainer" containerID="006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2" Oct 07 20:05:42 crc kubenswrapper[4825]: E1007 20:05:42.860163 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2\": container with ID starting with 006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2 not found: ID does not exist" containerID="006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.860521 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2"} err="failed to get container status \"006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2\": rpc error: code = NotFound desc = could not find container \"006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2\": container with ID starting with 006fe6b65b3b16d5cff557da6c4bb6d263606f5dc7b8474f048aaa0a4f294ad2 not found: ID does not exist" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.860609 4825 scope.go:117] "RemoveContainer" containerID="e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11" Oct 07 20:05:42 crc kubenswrapper[4825]: E1007 20:05:42.861452 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11\": container with ID starting with e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11 not found: ID does not exist" containerID="e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.861529 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11"} err="failed to get container status \"e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11\": rpc error: code = NotFound desc = could not find container \"e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11\": container with ID starting with e0d6335b584909f66648ddc63e4742f1c5f3deae1b7362411f433cc77d5b1c11 not found: ID does not exist" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.861556 4825 scope.go:117] "RemoveContainer" containerID="0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7" Oct 07 20:05:42 crc kubenswrapper[4825]: E1007 
20:05:42.861950 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7\": container with ID starting with 0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7 not found: ID does not exist" containerID="0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7" Oct 07 20:05:42 crc kubenswrapper[4825]: I1007 20:05:42.861987 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7"} err="failed to get container status \"0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7\": rpc error: code = NotFound desc = could not find container \"0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7\": container with ID starting with 0a6b72b6290873335eef35900cc6e685baae44027cd21db0758ea83a7ec2d0a7 not found: ID does not exist" Oct 07 20:05:43 crc kubenswrapper[4825]: I1007 20:05:43.821547 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" path="/var/lib/kubelet/pods/cbd506ae-baef-4cae-81c3-04882bf64064/volumes" Oct 07 20:05:44 crc kubenswrapper[4825]: I1007 20:05:44.796033 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:05:44 crc kubenswrapper[4825]: E1007 20:05:44.796590 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.301214 
4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m2kj9/must-gather-6qf5b"] Oct 07 20:05:45 crc kubenswrapper[4825]: E1007 20:05:45.301936 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="extract-utilities" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.301947 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="extract-utilities" Oct 07 20:05:45 crc kubenswrapper[4825]: E1007 20:05:45.301963 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="registry-server" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.301969 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="registry-server" Oct 07 20:05:45 crc kubenswrapper[4825]: E1007 20:05:45.301985 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="extract-content" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.301992 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="extract-content" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.302172 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd506ae-baef-4cae-81c3-04882bf64064" containerName="registry-server" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.303093 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.311754 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m2kj9"/"openshift-service-ca.crt" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.311780 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m2kj9"/"kube-root-ca.crt" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.313398 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m2kj9"/"default-dockercfg-vszqt" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.314189 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m2kj9/must-gather-6qf5b"] Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.485299 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-must-gather-output\") pod \"must-gather-6qf5b\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") " pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.485460 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw45b\" (UniqueName: \"kubernetes.io/projected/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-kube-api-access-sw45b\") pod \"must-gather-6qf5b\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") " pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.588481 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-must-gather-output\") pod \"must-gather-6qf5b\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") " 
pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.588619 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw45b\" (UniqueName: \"kubernetes.io/projected/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-kube-api-access-sw45b\") pod \"must-gather-6qf5b\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") " pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.589216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-must-gather-output\") pod \"must-gather-6qf5b\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") " pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.612441 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw45b\" (UniqueName: \"kubernetes.io/projected/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-kube-api-access-sw45b\") pod \"must-gather-6qf5b\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") " pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:45 crc kubenswrapper[4825]: I1007 20:05:45.625643 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" Oct 07 20:05:46 crc kubenswrapper[4825]: I1007 20:05:46.124722 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m2kj9/must-gather-6qf5b"] Oct 07 20:05:46 crc kubenswrapper[4825]: I1007 20:05:46.763196 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" event={"ID":"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5","Type":"ContainerStarted","Data":"9b7d170be11a83364ec9bd760fca9b0b22c04b42648d75b84ed0d83a9718b7e6"} Oct 07 20:05:46 crc kubenswrapper[4825]: I1007 20:05:46.763540 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" event={"ID":"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5","Type":"ContainerStarted","Data":"ad17529029b0a269b0b45e288592a2807c71db94fc1c7a4bc57c45e5243d4e51"} Oct 07 20:05:46 crc kubenswrapper[4825]: I1007 20:05:46.763556 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" event={"ID":"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5","Type":"ContainerStarted","Data":"2c627fc88ded1b3bc1e8bfd7f9330a5794f3dcb26cb85139e988e9f31230227d"} Oct 07 20:05:46 crc kubenswrapper[4825]: I1007 20:05:46.780719 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" podStartSLOduration=1.780691976 podStartE2EDuration="1.780691976s" podCreationTimestamp="2025-10-07 20:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 20:05:46.775840072 +0000 UTC m=+3935.597878749" watchObservedRunningTime="2025-10-07 20:05:46.780691976 +0000 UTC m=+3935.602730653" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.102690 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-mttcq"] Oct 07 20:05:50 crc kubenswrapper[4825]: 
I1007 20:05:50.104537 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.267568 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp6lr\" (UniqueName: \"kubernetes.io/projected/832ab5bf-7796-4bbf-9eef-d71ed33702e9-kube-api-access-lp6lr\") pod \"crc-debug-mttcq\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.267733 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832ab5bf-7796-4bbf-9eef-d71ed33702e9-host\") pod \"crc-debug-mttcq\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.369899 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp6lr\" (UniqueName: \"kubernetes.io/projected/832ab5bf-7796-4bbf-9eef-d71ed33702e9-kube-api-access-lp6lr\") pod \"crc-debug-mttcq\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.369952 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832ab5bf-7796-4bbf-9eef-d71ed33702e9-host\") pod \"crc-debug-mttcq\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.369985 4825 scope.go:117] "RemoveContainer" containerID="e75e895c4653c331d41e03141d836ddd6a18780952b183acc372378bc9bf3ce1" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.370047 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832ab5bf-7796-4bbf-9eef-d71ed33702e9-host\") pod \"crc-debug-mttcq\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.400589 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp6lr\" (UniqueName: \"kubernetes.io/projected/832ab5bf-7796-4bbf-9eef-d71ed33702e9-kube-api-access-lp6lr\") pod \"crc-debug-mttcq\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.440633 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:05:50 crc kubenswrapper[4825]: W1007 20:05:50.538527 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod832ab5bf_7796_4bbf_9eef_d71ed33702e9.slice/crio-c7382dbf187b9e569736a6145960b24df32053b9bc49e240eb6211e72011f3cd WatchSource:0}: Error finding container c7382dbf187b9e569736a6145960b24df32053b9bc49e240eb6211e72011f3cd: Status 404 returned error can't find the container with id c7382dbf187b9e569736a6145960b24df32053b9bc49e240eb6211e72011f3cd Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.807737 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" event={"ID":"832ab5bf-7796-4bbf-9eef-d71ed33702e9","Type":"ContainerStarted","Data":"7d6d95a4abf7b430388e75733ee4afbab8d5320a85c38548d5a3c8fe89a23220"} Oct 07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.808056 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" event={"ID":"832ab5bf-7796-4bbf-9eef-d71ed33702e9","Type":"ContainerStarted","Data":"c7382dbf187b9e569736a6145960b24df32053b9bc49e240eb6211e72011f3cd"} Oct 
07 20:05:50 crc kubenswrapper[4825]: I1007 20:05:50.845980 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" podStartSLOduration=0.845961091 podStartE2EDuration="845.961091ms" podCreationTimestamp="2025-10-07 20:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 20:05:50.824265052 +0000 UTC m=+3939.646303689" watchObservedRunningTime="2025-10-07 20:05:50.845961091 +0000 UTC m=+3939.667999728" Oct 07 20:05:57 crc kubenswrapper[4825]: I1007 20:05:57.795619 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:05:57 crc kubenswrapper[4825]: E1007 20:05:57.797252 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.785133 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cccp5"] Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.787710 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.806182 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cccp5"] Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.851338 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-catalog-content\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.851512 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-utilities\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.851585 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756dd\" (UniqueName: \"kubernetes.io/projected/95e42bd1-2d47-4922-92e3-68080e356050-kube-api-access-756dd\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.953258 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-catalog-content\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.953580 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-utilities\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.953637 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756dd\" (UniqueName: \"kubernetes.io/projected/95e42bd1-2d47-4922-92e3-68080e356050-kube-api-access-756dd\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.954501 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-catalog-content\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.954782 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-utilities\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:10 crc kubenswrapper[4825]: I1007 20:06:10.981128 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756dd\" (UniqueName: \"kubernetes.io/projected/95e42bd1-2d47-4922-92e3-68080e356050-kube-api-access-756dd\") pod \"community-operators-cccp5\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:11 crc kubenswrapper[4825]: I1007 20:06:11.104755 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:11 crc kubenswrapper[4825]: I1007 20:06:11.741746 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cccp5"] Oct 07 20:06:11 crc kubenswrapper[4825]: W1007 20:06:11.744942 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e42bd1_2d47_4922_92e3_68080e356050.slice/crio-cdba41670f4c2d3feb16b29d1111d97a1c81b92a351e5b0085c11e61cf74267e WatchSource:0}: Error finding container cdba41670f4c2d3feb16b29d1111d97a1c81b92a351e5b0085c11e61cf74267e: Status 404 returned error can't find the container with id cdba41670f4c2d3feb16b29d1111d97a1c81b92a351e5b0085c11e61cf74267e Oct 07 20:06:12 crc kubenswrapper[4825]: I1007 20:06:12.017176 4825 generic.go:334] "Generic (PLEG): container finished" podID="95e42bd1-2d47-4922-92e3-68080e356050" containerID="9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7" exitCode=0 Oct 07 20:06:12 crc kubenswrapper[4825]: I1007 20:06:12.017218 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cccp5" event={"ID":"95e42bd1-2d47-4922-92e3-68080e356050","Type":"ContainerDied","Data":"9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7"} Oct 07 20:06:12 crc kubenswrapper[4825]: I1007 20:06:12.017261 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cccp5" event={"ID":"95e42bd1-2d47-4922-92e3-68080e356050","Type":"ContainerStarted","Data":"cdba41670f4c2d3feb16b29d1111d97a1c81b92a351e5b0085c11e61cf74267e"} Oct 07 20:06:12 crc kubenswrapper[4825]: I1007 20:06:12.795282 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:06:12 crc kubenswrapper[4825]: E1007 20:06:12.795983 4825 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:06:13 crc kubenswrapper[4825]: I1007 20:06:13.025208 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cccp5" event={"ID":"95e42bd1-2d47-4922-92e3-68080e356050","Type":"ContainerStarted","Data":"c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2"} Oct 07 20:06:14 crc kubenswrapper[4825]: I1007 20:06:14.036064 4825 generic.go:334] "Generic (PLEG): container finished" podID="95e42bd1-2d47-4922-92e3-68080e356050" containerID="c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2" exitCode=0 Oct 07 20:06:14 crc kubenswrapper[4825]: I1007 20:06:14.036233 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cccp5" event={"ID":"95e42bd1-2d47-4922-92e3-68080e356050","Type":"ContainerDied","Data":"c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2"} Oct 07 20:06:15 crc kubenswrapper[4825]: I1007 20:06:15.056462 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cccp5" event={"ID":"95e42bd1-2d47-4922-92e3-68080e356050","Type":"ContainerStarted","Data":"558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f"} Oct 07 20:06:15 crc kubenswrapper[4825]: I1007 20:06:15.080646 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cccp5" podStartSLOduration=2.368778621 podStartE2EDuration="5.080624991s" podCreationTimestamp="2025-10-07 20:06:10 +0000 UTC" firstStartedPulling="2025-10-07 20:06:12.019081923 +0000 UTC m=+3960.841120560" 
lastFinishedPulling="2025-10-07 20:06:14.730928293 +0000 UTC m=+3963.552966930" observedRunningTime="2025-10-07 20:06:15.073679832 +0000 UTC m=+3963.895718469" watchObservedRunningTime="2025-10-07 20:06:15.080624991 +0000 UTC m=+3963.902663638" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.175572 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kqh2f"] Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.178224 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.194165 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqh2f"] Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.298253 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-catalog-content\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.298367 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-utilities\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.298432 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6wr\" (UniqueName: \"kubernetes.io/projected/50e104a0-6878-460b-9420-f2dc01d20e7f-kube-api-access-ft6wr\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " 
pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.399549 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-catalog-content\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.399659 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-utilities\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.399718 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6wr\" (UniqueName: \"kubernetes.io/projected/50e104a0-6878-460b-9420-f2dc01d20e7f-kube-api-access-ft6wr\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.400414 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-catalog-content\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.400621 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-utilities\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" 
Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.431964 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6wr\" (UniqueName: \"kubernetes.io/projected/50e104a0-6878-460b-9420-f2dc01d20e7f-kube-api-access-ft6wr\") pod \"redhat-marketplace-kqh2f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:18 crc kubenswrapper[4825]: I1007 20:06:18.507647 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:19 crc kubenswrapper[4825]: I1007 20:06:19.039681 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqh2f"] Oct 07 20:06:19 crc kubenswrapper[4825]: I1007 20:06:19.114297 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqh2f" event={"ID":"50e104a0-6878-460b-9420-f2dc01d20e7f","Type":"ContainerStarted","Data":"4b512e504ab365efd04936c905da369696b5509c78907e372b90990085746a53"} Oct 07 20:06:20 crc kubenswrapper[4825]: I1007 20:06:20.126269 4825 generic.go:334] "Generic (PLEG): container finished" podID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerID="3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c" exitCode=0 Oct 07 20:06:20 crc kubenswrapper[4825]: I1007 20:06:20.127099 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqh2f" event={"ID":"50e104a0-6878-460b-9420-f2dc01d20e7f","Type":"ContainerDied","Data":"3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c"} Oct 07 20:06:21 crc kubenswrapper[4825]: I1007 20:06:21.107397 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:21 crc kubenswrapper[4825]: I1007 20:06:21.107707 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:21 crc kubenswrapper[4825]: I1007 20:06:21.157101 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:21 crc kubenswrapper[4825]: I1007 20:06:21.213547 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:22 crc kubenswrapper[4825]: I1007 20:06:22.150946 4825 generic.go:334] "Generic (PLEG): container finished" podID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerID="9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef" exitCode=0 Oct 07 20:06:22 crc kubenswrapper[4825]: I1007 20:06:22.151016 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqh2f" event={"ID":"50e104a0-6878-460b-9420-f2dc01d20e7f","Type":"ContainerDied","Data":"9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef"} Oct 07 20:06:22 crc kubenswrapper[4825]: I1007 20:06:22.360379 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cccp5"] Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.161650 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cccp5" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="registry-server" containerID="cri-o://558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f" gracePeriod=2 Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.162389 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqh2f" event={"ID":"50e104a0-6878-460b-9420-f2dc01d20e7f","Type":"ContainerStarted","Data":"c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941"} Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.203034 4825 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-marketplace-kqh2f" podStartSLOduration=2.390922584 podStartE2EDuration="5.203015647s" podCreationTimestamp="2025-10-07 20:06:18 +0000 UTC" firstStartedPulling="2025-10-07 20:06:20.129316724 +0000 UTC m=+3968.951355361" lastFinishedPulling="2025-10-07 20:06:22.941409787 +0000 UTC m=+3971.763448424" observedRunningTime="2025-10-07 20:06:23.190679306 +0000 UTC m=+3972.012717943" watchObservedRunningTime="2025-10-07 20:06:23.203015647 +0000 UTC m=+3972.025054284" Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.588111 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.687524 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756dd\" (UniqueName: \"kubernetes.io/projected/95e42bd1-2d47-4922-92e3-68080e356050-kube-api-access-756dd\") pod \"95e42bd1-2d47-4922-92e3-68080e356050\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.687723 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-utilities\") pod \"95e42bd1-2d47-4922-92e3-68080e356050\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.687859 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-catalog-content\") pod \"95e42bd1-2d47-4922-92e3-68080e356050\" (UID: \"95e42bd1-2d47-4922-92e3-68080e356050\") " Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.688583 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-utilities" 
(OuterVolumeSpecName: "utilities") pod "95e42bd1-2d47-4922-92e3-68080e356050" (UID: "95e42bd1-2d47-4922-92e3-68080e356050"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.697365 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e42bd1-2d47-4922-92e3-68080e356050-kube-api-access-756dd" (OuterVolumeSpecName: "kube-api-access-756dd") pod "95e42bd1-2d47-4922-92e3-68080e356050" (UID: "95e42bd1-2d47-4922-92e3-68080e356050"). InnerVolumeSpecName "kube-api-access-756dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.790511 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.790545 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756dd\" (UniqueName: \"kubernetes.io/projected/95e42bd1-2d47-4922-92e3-68080e356050-kube-api-access-756dd\") on node \"crc\" DevicePath \"\"" Oct 07 20:06:23 crc kubenswrapper[4825]: I1007 20:06:23.795207 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:06:23 crc kubenswrapper[4825]: E1007 20:06:23.795535 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.134116 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95e42bd1-2d47-4922-92e3-68080e356050" (UID: "95e42bd1-2d47-4922-92e3-68080e356050"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.172438 4825 generic.go:334] "Generic (PLEG): container finished" podID="95e42bd1-2d47-4922-92e3-68080e356050" containerID="558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f" exitCode=0 Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.173651 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cccp5" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.176324 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cccp5" event={"ID":"95e42bd1-2d47-4922-92e3-68080e356050","Type":"ContainerDied","Data":"558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f"} Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.176364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cccp5" event={"ID":"95e42bd1-2d47-4922-92e3-68080e356050","Type":"ContainerDied","Data":"cdba41670f4c2d3feb16b29d1111d97a1c81b92a351e5b0085c11e61cf74267e"} Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.176380 4825 scope.go:117] "RemoveContainer" containerID="558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.208543 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95e42bd1-2d47-4922-92e3-68080e356050-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.208736 4825 scope.go:117] "RemoveContainer" 
containerID="c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.215968 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cccp5"] Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.223400 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cccp5"] Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.260372 4825 scope.go:117] "RemoveContainer" containerID="9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.315463 4825 scope.go:117] "RemoveContainer" containerID="558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f" Oct 07 20:06:24 crc kubenswrapper[4825]: E1007 20:06:24.315852 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f\": container with ID starting with 558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f not found: ID does not exist" containerID="558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.315882 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f"} err="failed to get container status \"558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f\": rpc error: code = NotFound desc = could not find container \"558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f\": container with ID starting with 558f2f50b53afcfb51bd2e2bc71f4fc63b0de3b43969ea3ffb47d9f6dced762f not found: ID does not exist" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.315900 4825 scope.go:117] "RemoveContainer" 
containerID="c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2" Oct 07 20:06:24 crc kubenswrapper[4825]: E1007 20:06:24.316138 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2\": container with ID starting with c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2 not found: ID does not exist" containerID="c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.316159 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2"} err="failed to get container status \"c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2\": rpc error: code = NotFound desc = could not find container \"c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2\": container with ID starting with c9d13c9fd5a6a19453524ae2c3f106dfb6252dfdd803cfd28ea8956b2f2b8da2 not found: ID does not exist" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.316172 4825 scope.go:117] "RemoveContainer" containerID="9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7" Oct 07 20:06:24 crc kubenswrapper[4825]: E1007 20:06:24.316675 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7\": container with ID starting with 9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7 not found: ID does not exist" containerID="9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7" Oct 07 20:06:24 crc kubenswrapper[4825]: I1007 20:06:24.316697 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7"} err="failed to get container status \"9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7\": rpc error: code = NotFound desc = could not find container \"9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7\": container with ID starting with 9dee347b06015a366c3ec08482d412a83abd1c6bb74d7e86375262db22ee5ab7 not found: ID does not exist" Oct 07 20:06:25 crc kubenswrapper[4825]: I1007 20:06:25.804951 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e42bd1-2d47-4922-92e3-68080e356050" path="/var/lib/kubelet/pods/95e42bd1-2d47-4922-92e3-68080e356050/volumes" Oct 07 20:06:28 crc kubenswrapper[4825]: I1007 20:06:28.508524 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:28 crc kubenswrapper[4825]: I1007 20:06:28.509034 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:28 crc kubenswrapper[4825]: I1007 20:06:28.566219 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:29 crc kubenswrapper[4825]: I1007 20:06:29.294509 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:29 crc kubenswrapper[4825]: I1007 20:06:29.342628 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqh2f"] Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.249177 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqh2f" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="registry-server" containerID="cri-o://c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941" 
gracePeriod=2 Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.732147 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.860781 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft6wr\" (UniqueName: \"kubernetes.io/projected/50e104a0-6878-460b-9420-f2dc01d20e7f-kube-api-access-ft6wr\") pod \"50e104a0-6878-460b-9420-f2dc01d20e7f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.861043 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-catalog-content\") pod \"50e104a0-6878-460b-9420-f2dc01d20e7f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.861172 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-utilities\") pod \"50e104a0-6878-460b-9420-f2dc01d20e7f\" (UID: \"50e104a0-6878-460b-9420-f2dc01d20e7f\") " Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.862014 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-utilities" (OuterVolumeSpecName: "utilities") pod "50e104a0-6878-460b-9420-f2dc01d20e7f" (UID: "50e104a0-6878-460b-9420-f2dc01d20e7f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.877538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e104a0-6878-460b-9420-f2dc01d20e7f-kube-api-access-ft6wr" (OuterVolumeSpecName: "kube-api-access-ft6wr") pod "50e104a0-6878-460b-9420-f2dc01d20e7f" (UID: "50e104a0-6878-460b-9420-f2dc01d20e7f"). InnerVolumeSpecName "kube-api-access-ft6wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.887377 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50e104a0-6878-460b-9420-f2dc01d20e7f" (UID: "50e104a0-6878-460b-9420-f2dc01d20e7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.963467 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.963541 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft6wr\" (UniqueName: \"kubernetes.io/projected/50e104a0-6878-460b-9420-f2dc01d20e7f-kube-api-access-ft6wr\") on node \"crc\" DevicePath \"\"" Oct 07 20:06:31 crc kubenswrapper[4825]: I1007 20:06:31.963571 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e104a0-6878-460b-9420-f2dc01d20e7f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.262419 4825 generic.go:334] "Generic (PLEG): container finished" podID="50e104a0-6878-460b-9420-f2dc01d20e7f" 
containerID="c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941" exitCode=0 Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.262459 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqh2f" event={"ID":"50e104a0-6878-460b-9420-f2dc01d20e7f","Type":"ContainerDied","Data":"c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941"} Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.262485 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqh2f" event={"ID":"50e104a0-6878-460b-9420-f2dc01d20e7f","Type":"ContainerDied","Data":"4b512e504ab365efd04936c905da369696b5509c78907e372b90990085746a53"} Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.262503 4825 scope.go:117] "RemoveContainer" containerID="c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.262631 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqh2f" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.316179 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqh2f"] Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.317528 4825 scope.go:117] "RemoveContainer" containerID="9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.347246 4825 scope.go:117] "RemoveContainer" containerID="3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.349651 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqh2f"] Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.398397 4825 scope.go:117] "RemoveContainer" containerID="c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941" Oct 07 20:06:32 crc kubenswrapper[4825]: E1007 20:06:32.401396 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941\": container with ID starting with c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941 not found: ID does not exist" containerID="c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.401436 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941"} err="failed to get container status \"c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941\": rpc error: code = NotFound desc = could not find container \"c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941\": container with ID starting with c348d3c676bc8107d7a6e333857db3da587377dbee26a3aac52ed621054c2941 not found: 
ID does not exist" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.401463 4825 scope.go:117] "RemoveContainer" containerID="9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef" Oct 07 20:06:32 crc kubenswrapper[4825]: E1007 20:06:32.401929 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef\": container with ID starting with 9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef not found: ID does not exist" containerID="9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.401971 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef"} err="failed to get container status \"9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef\": rpc error: code = NotFound desc = could not find container \"9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef\": container with ID starting with 9ac5c7410e510e88a3b3d37c6aa5c05bab5ae0018a16452769ee52127376fdef not found: ID does not exist" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.401998 4825 scope.go:117] "RemoveContainer" containerID="3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c" Oct 07 20:06:32 crc kubenswrapper[4825]: E1007 20:06:32.402253 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c\": container with ID starting with 3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c not found: ID does not exist" containerID="3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c" Oct 07 20:06:32 crc kubenswrapper[4825]: I1007 20:06:32.402273 4825 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c"} err="failed to get container status \"3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c\": rpc error: code = NotFound desc = could not find container \"3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c\": container with ID starting with 3db33f05455c57e1d3f0a0d2815d28bdceeccdeab2f90d42421d306ea2e24e8c not found: ID does not exist" Oct 07 20:06:33 crc kubenswrapper[4825]: I1007 20:06:33.806170 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" path="/var/lib/kubelet/pods/50e104a0-6878-460b-9420-f2dc01d20e7f/volumes" Oct 07 20:06:35 crc kubenswrapper[4825]: I1007 20:06:35.795809 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:06:35 crc kubenswrapper[4825]: E1007 20:06:35.796369 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:06:49 crc kubenswrapper[4825]: I1007 20:06:49.794960 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:06:49 crc kubenswrapper[4825]: E1007 20:06:49.795814 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:06:53 crc kubenswrapper[4825]: I1007 20:06:53.098523 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-657f997574-lnlbm_dc023b5f-d12b-4ce6-9cc6-1bac1fa48455/barbican-api/0.log" Oct 07 20:06:53 crc kubenswrapper[4825]: I1007 20:06:53.163469 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-657f997574-lnlbm_dc023b5f-d12b-4ce6-9cc6-1bac1fa48455/barbican-api-log/0.log" Oct 07 20:06:53 crc kubenswrapper[4825]: I1007 20:06:53.355088 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-594d76bc86-m9c6m_ea2e502f-f902-43be-989a-2f0ed4e3ae02/barbican-keystone-listener/0.log" Oct 07 20:06:53 crc kubenswrapper[4825]: I1007 20:06:53.413995 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-594d76bc86-m9c6m_ea2e502f-f902-43be-989a-2f0ed4e3ae02/barbican-keystone-listener-log/0.log" Oct 07 20:06:53 crc kubenswrapper[4825]: I1007 20:06:53.534648 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664844745f-dvzxg_56d9e279-5942-4a24-84db-5d7f8fcabcba/barbican-worker/0.log" Oct 07 20:06:53 crc kubenswrapper[4825]: I1007 20:06:53.787177 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664844745f-dvzxg_56d9e279-5942-4a24-84db-5d7f8fcabcba/barbican-worker-log/0.log" Oct 07 20:06:53 crc kubenswrapper[4825]: I1007 20:06:53.933615 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xlbc2_ede848ae-130b-4c5c-a4fb-873d9ea65cb6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.173796 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/ceilometer-central-agent/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.218727 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/ceilometer-notification-agent/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.247640 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/proxy-httpd/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.355904 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5553aae5-6efa-4d21-bbb7-f2c0f23071b3/sg-core/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.499952 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_598ea581-8e2b-47f6-8360-3907ab4c3f49/cinder-api/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.584993 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_598ea581-8e2b-47f6-8360-3907ab4c3f49/cinder-api-log/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.720566 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_06faa8d0-8aff-4422-a2cb-8643f6e920a8/cinder-scheduler/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.818829 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_06faa8d0-8aff-4422-a2cb-8643f6e920a8/probe/0.log" Oct 07 20:06:54 crc kubenswrapper[4825]: I1007 20:06:54.937188 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fqxnl_ad194ba9-9675-4a8e-be19-b44964a5b493/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:55 crc kubenswrapper[4825]: I1007 20:06:55.170715 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-rhx6z_26532318-7138-4557-9814-febc4ba75fb8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:55 crc kubenswrapper[4825]: I1007 20:06:55.259244 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vtxwn_da7f120f-3b67-4abe-a9b9-c2d2b5ce6b0c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:55 crc kubenswrapper[4825]: I1007 20:06:55.419332 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gkpsf_4dbb4b22-9fab-40a8-8fee-1d77e4e37c80/init/0.log" Oct 07 20:06:55 crc kubenswrapper[4825]: I1007 20:06:55.598060 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gkpsf_4dbb4b22-9fab-40a8-8fee-1d77e4e37c80/init/0.log" Oct 07 20:06:55 crc kubenswrapper[4825]: I1007 20:06:55.640938 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gkpsf_4dbb4b22-9fab-40a8-8fee-1d77e4e37c80/dnsmasq-dns/0.log" Oct 07 20:06:55 crc kubenswrapper[4825]: I1007 20:06:55.820880 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ngshx_35f68c5c-870d-448d-a680-decef3790f6b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:55 crc kubenswrapper[4825]: I1007 20:06:55.853214 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1356ee9f-f727-42b6-9a53-f80e78720704/glance-httpd/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.038583 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1356ee9f-f727-42b6-9a53-f80e78720704/glance-log/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.086284 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_747f4079-112d-4889-937f-fc39c9d75819/glance-httpd/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.194992 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_747f4079-112d-4889-937f-fc39c9d75819/glance-log/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.406071 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58d7dd5b56-nhlgz_710a139f-bf12-4021-b702-3e40d49febf1/horizon/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.548166 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qz7pr_e3a03ee0-54d8-44d6-94fb-59a5bbed04fd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.648912 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58d7dd5b56-nhlgz_710a139f-bf12-4021-b702-3e40d49febf1/horizon-log/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.672994 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tb8kn_c2e86406-64eb-4c0c-8f9d-38b2a64ddc48/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.895615 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29331121-n45sq_48f51d5f-d0ae-4123-969f-8bda81cbfb85/keystone-cron/0.log" Oct 07 20:06:56 crc kubenswrapper[4825]: I1007 20:06:56.965020 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6b848888b7-8bpk8_a0cf82d8-414d-4486-9cef-be5b38e75745/keystone-api/0.log" Oct 07 20:06:57 crc kubenswrapper[4825]: I1007 20:06:57.055954 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_f8d0cfb5-1a41-442c-b8d3-b1f3e2d8418e/kube-state-metrics/0.log" Oct 07 20:06:57 crc kubenswrapper[4825]: I1007 20:06:57.193364 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ltvrw_6a978ccd-af77-4892-9bae-0f87170eb4a1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:57 crc kubenswrapper[4825]: I1007 20:06:57.738816 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d47b47d5-hc6q5_e07eddca-def8-4a86-8d72-0c916ba6b6c1/neutron-httpd/0.log" Oct 07 20:06:57 crc kubenswrapper[4825]: I1007 20:06:57.798485 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d47b47d5-hc6q5_e07eddca-def8-4a86-8d72-0c916ba6b6c1/neutron-api/0.log" Oct 07 20:06:57 crc kubenswrapper[4825]: I1007 20:06:57.944026 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gr759_e22aff57-c4de-445a-b196-23d2e791a10f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:58 crc kubenswrapper[4825]: I1007 20:06:58.387420 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7f4fc7-89f3-4e32-94fe-f4117c1ca522/nova-api-log/0.log" Oct 07 20:06:58 crc kubenswrapper[4825]: I1007 20:06:58.607486 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_395c3018-72fe-4e48-a92d-e98026e550a3/nova-cell0-conductor-conductor/0.log" Oct 07 20:06:58 crc kubenswrapper[4825]: I1007 20:06:58.643805 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7f4fc7-89f3-4e32-94fe-f4117c1ca522/nova-api-api/0.log" Oct 07 20:06:59 crc kubenswrapper[4825]: I1007 20:06:59.037601 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_df2bca65-1a0f-4e1c-ba16-5b18bb7e71b7/nova-cell1-conductor-conductor/0.log" 
Oct 07 20:06:59 crc kubenswrapper[4825]: I1007 20:06:59.065878 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_89c586f2-b817-4c06-92cf-8b7832e8acc6/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 20:06:59 crc kubenswrapper[4825]: I1007 20:06:59.324844 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-l9nwl_3c1f45e7-330e-4c79-8609-2988aac67b05/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:06:59 crc kubenswrapper[4825]: I1007 20:06:59.361743 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c4c4cfbe-20c8-402c-90b0-040fbbb0d58e/nova-metadata-log/0.log" Oct 07 20:06:59 crc kubenswrapper[4825]: I1007 20:06:59.787910 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_146610c1-1e58-4a52-ba58-b190f53f4a03/nova-scheduler-scheduler/0.log" Oct 07 20:06:59 crc kubenswrapper[4825]: I1007 20:06:59.986266 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6451a8c0-c6b1-4098-846d-24fe8c26d849/mysql-bootstrap/0.log" Oct 07 20:07:00 crc kubenswrapper[4825]: I1007 20:07:00.795160 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:07:00 crc kubenswrapper[4825]: E1007 20:07:00.796426 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:07:00 crc kubenswrapper[4825]: I1007 20:07:00.844406 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_c4c4cfbe-20c8-402c-90b0-040fbbb0d58e/nova-metadata-metadata/0.log" Oct 07 20:07:00 crc kubenswrapper[4825]: I1007 20:07:00.942445 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6451a8c0-c6b1-4098-846d-24fe8c26d849/mysql-bootstrap/0.log" Oct 07 20:07:01 crc kubenswrapper[4825]: I1007 20:07:00.983465 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6451a8c0-c6b1-4098-846d-24fe8c26d849/galera/0.log" Oct 07 20:07:01 crc kubenswrapper[4825]: I1007 20:07:01.138134 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fa0ba0a4-872f-4ebd-8ee1-0e57174648a9/mysql-bootstrap/0.log" Oct 07 20:07:01 crc kubenswrapper[4825]: I1007 20:07:01.317785 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fa0ba0a4-872f-4ebd-8ee1-0e57174648a9/galera/0.log" Oct 07 20:07:01 crc kubenswrapper[4825]: I1007 20:07:01.348130 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fa0ba0a4-872f-4ebd-8ee1-0e57174648a9/mysql-bootstrap/0.log" Oct 07 20:07:01 crc kubenswrapper[4825]: I1007 20:07:01.505582 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_44d41a47-16c3-4bd1-be08-b06bd6f8734f/openstackclient/0.log" Oct 07 20:07:01 crc kubenswrapper[4825]: I1007 20:07:01.735937 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-j2dqr_5476bb52-18e5-41e6-b087-3cd2d6e81a87/openstack-network-exporter/0.log" Oct 07 20:07:01 crc kubenswrapper[4825]: I1007 20:07:01.976375 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mqtlv_0392f085-cd23-439c-b8aa-e3c94fc320b8/ovn-controller/0.log" Oct 07 20:07:02 crc kubenswrapper[4825]: I1007 20:07:02.377139 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovsdb-server-init/0.log" Oct 07 20:07:02 crc kubenswrapper[4825]: I1007 20:07:02.557038 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovs-vswitchd/0.log" Oct 07 20:07:02 crc kubenswrapper[4825]: I1007 20:07:02.572277 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovsdb-server-init/0.log" Oct 07 20:07:02 crc kubenswrapper[4825]: I1007 20:07:02.597591 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9zcg2_16ff2637-d49f-4b3b-b3f4-b731b51e8875/ovsdb-server/0.log" Oct 07 20:07:02 crc kubenswrapper[4825]: I1007 20:07:02.864984 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xxqvz_a32b4f65-af6c-4bed-a97c-ec9ced0b4c45/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.019694 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58/openstack-network-exporter/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.073623 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b1f2dcf-57dd-4f0b-8221-a3738fcbdb58/ovn-northd/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.171812 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_96ff0bc5-e277-4f6a-a3b3-815e01ac42b7/openstack-network-exporter/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.240932 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_96ff0bc5-e277-4f6a-a3b3-815e01ac42b7/ovsdbserver-nb/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.344566 4825 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bf48b556-d051-49b5-b9fb-fa6b325e0f79/openstack-network-exporter/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.423407 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bf48b556-d051-49b5-b9fb-fa6b325e0f79/ovsdbserver-sb/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.632670 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f4b8c987b-kjdd8_4297247b-64e7-4379-aa35-9e2bf6d2d5d5/placement-api/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.721148 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f4b8c987b-kjdd8_4297247b-64e7-4379-aa35-9e2bf6d2d5d5/placement-log/0.log" Oct 07 20:07:03 crc kubenswrapper[4825]: I1007 20:07:03.817472 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e773083b-ae36-44eb-bb82-18b12b504439/setup-container/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.060285 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e773083b-ae36-44eb-bb82-18b12b504439/setup-container/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.106354 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e773083b-ae36-44eb-bb82-18b12b504439/rabbitmq/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.268292 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_18c777f8-aad0-482a-b132-ad417d64eb6e/setup-container/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.527443 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_18c777f8-aad0-482a-b132-ad417d64eb6e/rabbitmq/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.538860 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_18c777f8-aad0-482a-b132-ad417d64eb6e/setup-container/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.750463 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-bdcwn_5cd31618-4e62-438f-b168-1d322052785d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.765657 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-q2wcd_7f652c6b-fc94-47dc-90ec-a19d7e49d728/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:07:04 crc kubenswrapper[4825]: I1007 20:07:04.994752 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xw8nw_00148c9a-f926-4ff0-a78a-239fae3968d5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:07:05 crc kubenswrapper[4825]: I1007 20:07:05.180291 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8kr2p_fdf4da6b-b218-4ab1-87c6-7b8cfcef6810/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:07:05 crc kubenswrapper[4825]: I1007 20:07:05.307376 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d55wk_63e9c706-689c-43be-a9a3-67f20fbfea88/ssh-known-hosts-edpm-deployment/0.log" Oct 07 20:07:05 crc kubenswrapper[4825]: I1007 20:07:05.577838 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57f8b4b869-t42c2_aeabd5f0-6573-402d-a5df-c0bc41d16a67/proxy-server/0.log" Oct 07 20:07:05 crc kubenswrapper[4825]: I1007 20:07:05.634347 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57f8b4b869-t42c2_aeabd5f0-6573-402d-a5df-c0bc41d16a67/proxy-httpd/0.log" Oct 07 20:07:05 crc kubenswrapper[4825]: I1007 20:07:05.746373 4825 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-whwz4_13a46859-41a7-4783-9c3d-be9e48db5526/swift-ring-rebalance/0.log" Oct 07 20:07:05 crc kubenswrapper[4825]: I1007 20:07:05.943767 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-auditor/0.log" Oct 07 20:07:05 crc kubenswrapper[4825]: I1007 20:07:05.959551 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-reaper/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.099823 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-replicator/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.144260 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/account-server/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.148153 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-auditor/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.314649 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-replicator/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.374123 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-updater/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.408014 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/container-server/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.543082 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-auditor/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.616062 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-expirer/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.653989 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-replicator/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.713016 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-server/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.870457 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/rsync/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.871926 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/object-updater/0.log" Oct 07 20:07:06 crc kubenswrapper[4825]: I1007 20:07:06.914603 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_43cb88e3-5a22-4562-86b0-b016c7ff1dcf/swift-recon-cron/0.log" Oct 07 20:07:07 crc kubenswrapper[4825]: I1007 20:07:07.105217 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4lzsv_803e0c1d-979b-47da-ba63-cad0323972a8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:07:07 crc kubenswrapper[4825]: I1007 20:07:07.504737 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_cf9823f2-5baf-49ba-9da5-a8f13ac66d75/tempest-tests-tempest-tests-runner/0.log" Oct 07 20:07:07 crc kubenswrapper[4825]: I1007 20:07:07.516518 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6a1f604e-4e0b-42a3-a2c5-0b42417baa2f/test-operator-logs-container/0.log" Oct 07 20:07:07 crc kubenswrapper[4825]: I1007 20:07:07.702828 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6fnnx_b0c642ee-a887-496b-a212-48601b94af99/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 20:07:12 crc kubenswrapper[4825]: I1007 20:07:12.794987 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:07:12 crc kubenswrapper[4825]: E1007 20:07:12.795570 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:07:14 crc kubenswrapper[4825]: I1007 20:07:14.629693 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8448c74b-bea3-42c0-95da-ab251a90ca9f/memcached/0.log" Oct 07 20:07:25 crc kubenswrapper[4825]: I1007 20:07:25.796054 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:07:25 crc kubenswrapper[4825]: E1007 20:07:25.797020 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 
07 20:07:36 crc kubenswrapper[4825]: I1007 20:07:36.897416 4825 generic.go:334] "Generic (PLEG): container finished" podID="832ab5bf-7796-4bbf-9eef-d71ed33702e9" containerID="7d6d95a4abf7b430388e75733ee4afbab8d5320a85c38548d5a3c8fe89a23220" exitCode=0 Oct 07 20:07:36 crc kubenswrapper[4825]: I1007 20:07:36.897561 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" event={"ID":"832ab5bf-7796-4bbf-9eef-d71ed33702e9","Type":"ContainerDied","Data":"7d6d95a4abf7b430388e75733ee4afbab8d5320a85c38548d5a3c8fe89a23220"} Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.037021 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.087936 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-mttcq"] Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.102527 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-mttcq"] Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.182370 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp6lr\" (UniqueName: \"kubernetes.io/projected/832ab5bf-7796-4bbf-9eef-d71ed33702e9-kube-api-access-lp6lr\") pod \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.182660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832ab5bf-7796-4bbf-9eef-d71ed33702e9-host\") pod \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\" (UID: \"832ab5bf-7796-4bbf-9eef-d71ed33702e9\") " Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.182995 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/832ab5bf-7796-4bbf-9eef-d71ed33702e9-host" (OuterVolumeSpecName: "host") pod "832ab5bf-7796-4bbf-9eef-d71ed33702e9" (UID: "832ab5bf-7796-4bbf-9eef-d71ed33702e9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.183713 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832ab5bf-7796-4bbf-9eef-d71ed33702e9-host\") on node \"crc\" DevicePath \"\"" Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.195603 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832ab5bf-7796-4bbf-9eef-d71ed33702e9-kube-api-access-lp6lr" (OuterVolumeSpecName: "kube-api-access-lp6lr") pod "832ab5bf-7796-4bbf-9eef-d71ed33702e9" (UID: "832ab5bf-7796-4bbf-9eef-d71ed33702e9"). InnerVolumeSpecName "kube-api-access-lp6lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.285181 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp6lr\" (UniqueName: \"kubernetes.io/projected/832ab5bf-7796-4bbf-9eef-d71ed33702e9-kube-api-access-lp6lr\") on node \"crc\" DevicePath \"\"" Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.796297 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:07:38 crc kubenswrapper[4825]: E1007 20:07:38.796529 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.928152 
4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7382dbf187b9e569736a6145960b24df32053b9bc49e240eb6211e72011f3cd" Oct 07 20:07:38 crc kubenswrapper[4825]: I1007 20:07:38.928270 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-mttcq" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.287151 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-cv8rp"] Oct 07 20:07:39 crc kubenswrapper[4825]: E1007 20:07:39.288694 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="extract-content" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.288797 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="extract-content" Oct 07 20:07:39 crc kubenswrapper[4825]: E1007 20:07:39.288894 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="registry-server" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.288963 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="registry-server" Oct 07 20:07:39 crc kubenswrapper[4825]: E1007 20:07:39.289037 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832ab5bf-7796-4bbf-9eef-d71ed33702e9" containerName="container-00" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.289106 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="832ab5bf-7796-4bbf-9eef-d71ed33702e9" containerName="container-00" Oct 07 20:07:39 crc kubenswrapper[4825]: E1007 20:07:39.289176 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="extract-utilities" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.289259 4825 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="extract-utilities" Oct 07 20:07:39 crc kubenswrapper[4825]: E1007 20:07:39.289345 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="registry-server" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.289413 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="registry-server" Oct 07 20:07:39 crc kubenswrapper[4825]: E1007 20:07:39.289503 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="extract-content" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.289627 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="extract-content" Oct 07 20:07:39 crc kubenswrapper[4825]: E1007 20:07:39.290111 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="extract-utilities" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.290204 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="extract-utilities" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.290541 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e42bd1-2d47-4922-92e3-68080e356050" containerName="registry-server" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.290628 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e104a0-6878-460b-9420-f2dc01d20e7f" containerName="registry-server" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.290718 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="832ab5bf-7796-4bbf-9eef-d71ed33702e9" containerName="container-00" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.291535 4825 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.407787 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0eea763-e545-4dce-9d2c-3934bb8406a9-host\") pod \"crc-debug-cv8rp\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.408059 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vgx\" (UniqueName: \"kubernetes.io/projected/c0eea763-e545-4dce-9d2c-3934bb8406a9-kube-api-access-q2vgx\") pod \"crc-debug-cv8rp\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.510315 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vgx\" (UniqueName: \"kubernetes.io/projected/c0eea763-e545-4dce-9d2c-3934bb8406a9-kube-api-access-q2vgx\") pod \"crc-debug-cv8rp\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.510552 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0eea763-e545-4dce-9d2c-3934bb8406a9-host\") pod \"crc-debug-cv8rp\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.510679 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0eea763-e545-4dce-9d2c-3934bb8406a9-host\") pod \"crc-debug-cv8rp\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " 
pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.543989 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vgx\" (UniqueName: \"kubernetes.io/projected/c0eea763-e545-4dce-9d2c-3934bb8406a9-kube-api-access-q2vgx\") pod \"crc-debug-cv8rp\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.608561 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.819094 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832ab5bf-7796-4bbf-9eef-d71ed33702e9" path="/var/lib/kubelet/pods/832ab5bf-7796-4bbf-9eef-d71ed33702e9/volumes" Oct 07 20:07:39 crc kubenswrapper[4825]: I1007 20:07:39.944561 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" event={"ID":"c0eea763-e545-4dce-9d2c-3934bb8406a9","Type":"ContainerStarted","Data":"6390b19e56c1c220242f50c33478485828ea1b40679906357a3964682587b72b"} Oct 07 20:07:40 crc kubenswrapper[4825]: I1007 20:07:40.965176 4825 generic.go:334] "Generic (PLEG): container finished" podID="c0eea763-e545-4dce-9d2c-3934bb8406a9" containerID="e6eb3f71c7fb6b22857b9f59e88d2fe59f65755e57b39b26045266066f0f4fe8" exitCode=0 Oct 07 20:07:40 crc kubenswrapper[4825]: I1007 20:07:40.965584 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" event={"ID":"c0eea763-e545-4dce-9d2c-3934bb8406a9","Type":"ContainerDied","Data":"e6eb3f71c7fb6b22857b9f59e88d2fe59f65755e57b39b26045266066f0f4fe8"} Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.077180 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.157084 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2vgx\" (UniqueName: \"kubernetes.io/projected/c0eea763-e545-4dce-9d2c-3934bb8406a9-kube-api-access-q2vgx\") pod \"c0eea763-e545-4dce-9d2c-3934bb8406a9\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.157116 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0eea763-e545-4dce-9d2c-3934bb8406a9-host\") pod \"c0eea763-e545-4dce-9d2c-3934bb8406a9\" (UID: \"c0eea763-e545-4dce-9d2c-3934bb8406a9\") " Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.157499 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0eea763-e545-4dce-9d2c-3934bb8406a9-host" (OuterVolumeSpecName: "host") pod "c0eea763-e545-4dce-9d2c-3934bb8406a9" (UID: "c0eea763-e545-4dce-9d2c-3934bb8406a9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.158150 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0eea763-e545-4dce-9d2c-3934bb8406a9-host\") on node \"crc\" DevicePath \"\"" Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.550656 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0eea763-e545-4dce-9d2c-3934bb8406a9-kube-api-access-q2vgx" (OuterVolumeSpecName: "kube-api-access-q2vgx") pod "c0eea763-e545-4dce-9d2c-3934bb8406a9" (UID: "c0eea763-e545-4dce-9d2c-3934bb8406a9"). InnerVolumeSpecName "kube-api-access-q2vgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.563480 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2vgx\" (UniqueName: \"kubernetes.io/projected/c0eea763-e545-4dce-9d2c-3934bb8406a9-kube-api-access-q2vgx\") on node \"crc\" DevicePath \"\"" Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.981587 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" event={"ID":"c0eea763-e545-4dce-9d2c-3934bb8406a9","Type":"ContainerDied","Data":"6390b19e56c1c220242f50c33478485828ea1b40679906357a3964682587b72b"} Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.981624 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6390b19e56c1c220242f50c33478485828ea1b40679906357a3964682587b72b" Oct 07 20:07:42 crc kubenswrapper[4825]: I1007 20:07:42.981643 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-cv8rp" Oct 07 20:07:47 crc kubenswrapper[4825]: I1007 20:07:47.537667 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-cv8rp"] Oct 07 20:07:47 crc kubenswrapper[4825]: I1007 20:07:47.544500 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-cv8rp"] Oct 07 20:07:47 crc kubenswrapper[4825]: I1007 20:07:47.830597 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0eea763-e545-4dce-9d2c-3934bb8406a9" path="/var/lib/kubelet/pods/c0eea763-e545-4dce-9d2c-3934bb8406a9/volumes" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.758375 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-6w6nq"] Oct 07 20:07:48 crc kubenswrapper[4825]: E1007 20:07:48.759462 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eea763-e545-4dce-9d2c-3934bb8406a9" 
containerName="container-00" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.759485 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eea763-e545-4dce-9d2c-3934bb8406a9" containerName="container-00" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.759938 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eea763-e545-4dce-9d2c-3934bb8406a9" containerName="container-00" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.761010 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.858466 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c063b479-3e76-4a83-a27a-3c7ea2c298c5-host\") pod \"crc-debug-6w6nq\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.858648 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7lh7\" (UniqueName: \"kubernetes.io/projected/c063b479-3e76-4a83-a27a-3c7ea2c298c5-kube-api-access-d7lh7\") pod \"crc-debug-6w6nq\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.960886 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c063b479-3e76-4a83-a27a-3c7ea2c298c5-host\") pod \"crc-debug-6w6nq\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.960992 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7lh7\" (UniqueName: 
\"kubernetes.io/projected/c063b479-3e76-4a83-a27a-3c7ea2c298c5-kube-api-access-d7lh7\") pod \"crc-debug-6w6nq\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.961045 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c063b479-3e76-4a83-a27a-3c7ea2c298c5-host\") pod \"crc-debug-6w6nq\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:48 crc kubenswrapper[4825]: I1007 20:07:48.986818 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7lh7\" (UniqueName: \"kubernetes.io/projected/c063b479-3e76-4a83-a27a-3c7ea2c298c5-kube-api-access-d7lh7\") pod \"crc-debug-6w6nq\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:49 crc kubenswrapper[4825]: I1007 20:07:49.090210 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:50 crc kubenswrapper[4825]: I1007 20:07:50.050716 4825 generic.go:334] "Generic (PLEG): container finished" podID="c063b479-3e76-4a83-a27a-3c7ea2c298c5" containerID="ff07fa3450b43ec50085e9e4d46bf3a49b487fcffd1d3676875800239cbe1e39" exitCode=0 Oct 07 20:07:50 crc kubenswrapper[4825]: I1007 20:07:50.050869 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" event={"ID":"c063b479-3e76-4a83-a27a-3c7ea2c298c5","Type":"ContainerDied","Data":"ff07fa3450b43ec50085e9e4d46bf3a49b487fcffd1d3676875800239cbe1e39"} Oct 07 20:07:50 crc kubenswrapper[4825]: I1007 20:07:50.052413 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" event={"ID":"c063b479-3e76-4a83-a27a-3c7ea2c298c5","Type":"ContainerStarted","Data":"49e247dd6933f338ffa7950b1e614dd26ba6b83414c306e6d4c94f25f054548a"} Oct 07 20:07:50 crc kubenswrapper[4825]: I1007 20:07:50.111242 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-6w6nq"] Oct 07 20:07:50 crc kubenswrapper[4825]: I1007 20:07:50.126022 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m2kj9/crc-debug-6w6nq"] Oct 07 20:07:50 crc kubenswrapper[4825]: I1007 20:07:50.587983 4825 scope.go:117] "RemoveContainer" containerID="d6a45b03eeb4a94960d14b07f6be694d11743377ed547faee8f74683cc3c0def" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.163833 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.305997 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c063b479-3e76-4a83-a27a-3c7ea2c298c5-host\") pod \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.306122 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c063b479-3e76-4a83-a27a-3c7ea2c298c5-host" (OuterVolumeSpecName: "host") pod "c063b479-3e76-4a83-a27a-3c7ea2c298c5" (UID: "c063b479-3e76-4a83-a27a-3c7ea2c298c5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.306257 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7lh7\" (UniqueName: \"kubernetes.io/projected/c063b479-3e76-4a83-a27a-3c7ea2c298c5-kube-api-access-d7lh7\") pod \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\" (UID: \"c063b479-3e76-4a83-a27a-3c7ea2c298c5\") " Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.306717 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c063b479-3e76-4a83-a27a-3c7ea2c298c5-host\") on node \"crc\" DevicePath \"\"" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.322516 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c063b479-3e76-4a83-a27a-3c7ea2c298c5-kube-api-access-d7lh7" (OuterVolumeSpecName: "kube-api-access-d7lh7") pod "c063b479-3e76-4a83-a27a-3c7ea2c298c5" (UID: "c063b479-3e76-4a83-a27a-3c7ea2c298c5"). InnerVolumeSpecName "kube-api-access-d7lh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.408713 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7lh7\" (UniqueName: \"kubernetes.io/projected/c063b479-3e76-4a83-a27a-3c7ea2c298c5-kube-api-access-d7lh7\") on node \"crc\" DevicePath \"\"" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.754137 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/util/0.log" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.814250 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c063b479-3e76-4a83-a27a-3c7ea2c298c5" path="/var/lib/kubelet/pods/c063b479-3e76-4a83-a27a-3c7ea2c298c5/volumes" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.978612 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/pull/0.log" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.984011 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/util/0.log" Oct 07 20:07:51 crc kubenswrapper[4825]: I1007 20:07:51.998557 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/pull/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.070520 4825 scope.go:117] "RemoveContainer" containerID="ff07fa3450b43ec50085e9e4d46bf3a49b487fcffd1d3676875800239cbe1e39" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.070554 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m2kj9/crc-debug-6w6nq" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.170454 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/util/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.207939 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/pull/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.223527 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0b7jk2_513bdedd-0708-4b31-afc3-93beee6324dd/extract/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.341149 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-mrnwx_97dc66cd-4313-4951-b85c-dedd5cd2e6ba/kube-rbac-proxy/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.423894 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-mrnwx_97dc66cd-4313-4951-b85c-dedd5cd2e6ba/manager/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.449032 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v4dk9_ef71a3c8-e986-4f19-a234-9e9ef7749132/kube-rbac-proxy/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.571124 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v4dk9_ef71a3c8-e986-4f19-a234-9e9ef7749132/manager/0.log" Oct 07 20:07:52 crc kubenswrapper[4825]: I1007 20:07:52.603515 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-ln2cd_c310d873-fc90-4658-ab06-ffa16a97c784/kube-rbac-proxy/0.log" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.478711 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-ln2cd_c310d873-fc90-4658-ab06-ffa16a97c784/manager/0.log" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.571552 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-n28d6_a0f7df98-caae-40a5-bb89-94123bce0763/kube-rbac-proxy/0.log" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.599026 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-n28d6_a0f7df98-caae-40a5-bb89-94123bce0763/manager/0.log" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.713623 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jrh49_62bd3185-8c68-419d-b523-2de43d8dd015/kube-rbac-proxy/0.log" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.777983 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jrh49_62bd3185-8c68-419d-b523-2de43d8dd015/manager/0.log" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.795321 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:07:53 crc kubenswrapper[4825]: E1007 20:07:53.795574 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.865402 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-b9tp8_282406b3-2501-4b01-adf1-d952fc240404/kube-rbac-proxy/0.log" Oct 07 20:07:53 crc kubenswrapper[4825]: I1007 20:07:53.937716 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-b9tp8_282406b3-2501-4b01-adf1-d952fc240404/manager/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.000921 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-f9pdx_1b11f862-ee30-4996-a8fb-218b3c27f07a/kube-rbac-proxy/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.182649 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-m87x4_3f63b792-0ed9-453e-8dff-afac52bac339/kube-rbac-proxy/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.220895 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-m87x4_3f63b792-0ed9-453e-8dff-afac52bac339/manager/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.232402 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-f9pdx_1b11f862-ee30-4996-a8fb-218b3c27f07a/manager/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.348865 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-5mng4_528dd884-a7df-4574-920f-86ae0d779b62/kube-rbac-proxy/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.443273 4825 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-5mng4_528dd884-a7df-4574-920f-86ae0d779b62/manager/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.482723 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-9nndm_9e61b6db-a40e-4ce3-8086-e51bbc6f6295/kube-rbac-proxy/0.log" Oct 07 20:07:54 crc kubenswrapper[4825]: I1007 20:07:54.501121 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-9nndm_9e61b6db-a40e-4ce3-8086-e51bbc6f6295/manager/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 20:07:55.572596 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm_22f18be3-b165-4b14-90bd-3eac19ae3fee/kube-rbac-proxy/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 20:07:55.587501 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-fpkzs_01529960-5bd1-4a4d-8703-8d6a3ff38d4b/kube-rbac-proxy/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 20:07:55.599546 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-lpzwm_22f18be3-b165-4b14-90bd-3eac19ae3fee/manager/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 20:07:55.635167 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-fpkzs_01529960-5bd1-4a4d-8703-8d6a3ff38d4b/manager/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 20:07:55.772720 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vn86z_0caa8db7-d83d-47bd-9276-29102dd20de8/kube-rbac-proxy/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 
20:07:55.838149 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-hjgxc_0ecb1a32-2936-470c-a9c5-6701d461cd71/kube-rbac-proxy/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 20:07:55.863979 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vn86z_0caa8db7-d83d-47bd-9276-29102dd20de8/manager/0.log" Oct 07 20:07:55 crc kubenswrapper[4825]: I1007 20:07:55.989613 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-hjgxc_0ecb1a32-2936-470c-a9c5-6701d461cd71/manager/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.036440 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv_8cacb372-6381-4182-92eb-81e607f7cf31/kube-rbac-proxy/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.038770 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c8kvgv_8cacb372-6381-4182-92eb-81e607f7cf31/manager/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.159094 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-rxjhm_3463b9a9-3935-4a41-b710-77084296fa18/kube-rbac-proxy/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.229437 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-w4tpc_0b0eb630-7794-4425-9ada-29b15acb6bdb/kube-rbac-proxy/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.384215 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-w4tpc_0b0eb630-7794-4425-9ada-29b15acb6bdb/operator/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.395540 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-l89d7_fd7a1b83-b50f-41c1-8092-ce7135ffe155/registry-server/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.558255 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-9llfs_844cfe74-a770-4268-a60a-372586ac0744/kube-rbac-proxy/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.657596 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-9llfs_844cfe74-a770-4268-a60a-372586ac0744/manager/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.665678 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lp8qp_3b8778b6-81a2-4e3c-b464-6e5c8e063a4b/kube-rbac-proxy/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.790274 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lp8qp_3b8778b6-81a2-4e3c-b464-6e5c8e063a4b/manager/0.log" Oct 07 20:07:56 crc kubenswrapper[4825]: I1007 20:07:56.858362 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rgt49_063bceb1-c26d-453a-a74a-e6874c273034/operator/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.037784 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bs76l_c720cafe-11e6-4959-8228-b03cdb65242d/kube-rbac-proxy/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.080380 4825 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bs76l_c720cafe-11e6-4959-8228-b03cdb65242d/manager/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.198549 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-rxjhm_3463b9a9-3935-4a41-b710-77084296fa18/manager/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.202982 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-jvj5c_7f5bc608-3853-4a58-ac8d-18f57baffe4c/kube-rbac-proxy/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.299864 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-jvj5c_7f5bc608-3853-4a58-ac8d-18f57baffe4c/manager/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.317130 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lg8z7_bd6d051a-119d-45c5-9b81-939bba328c56/kube-rbac-proxy/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.362621 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lg8z7_bd6d051a-119d-45c5-9b81-939bba328c56/manager/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.455105 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-zbnmp_f99d1a15-090e-4a5e-a210-690be64c4742/kube-rbac-proxy/0.log" Oct 07 20:07:57 crc kubenswrapper[4825]: I1007 20:07:57.489574 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-zbnmp_f99d1a15-090e-4a5e-a210-690be64c4742/manager/0.log" Oct 07 20:08:07 crc kubenswrapper[4825]: 
I1007 20:08:07.796131 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:08:07 crc kubenswrapper[4825]: E1007 20:08:07.798880 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:08:14 crc kubenswrapper[4825]: I1007 20:08:14.594074 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mhcpj_962646e1-6f06-40ed-a19a-d73f55b93d95/control-plane-machine-set-operator/0.log" Oct 07 20:08:14 crc kubenswrapper[4825]: I1007 20:08:14.745125 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hvsq_b3cea192-f8e9-426c-887e-68a8d8f2dad5/kube-rbac-proxy/0.log" Oct 07 20:08:14 crc kubenswrapper[4825]: I1007 20:08:14.752442 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hvsq_b3cea192-f8e9-426c-887e-68a8d8f2dad5/machine-api-operator/0.log" Oct 07 20:08:22 crc kubenswrapper[4825]: I1007 20:08:22.795871 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:08:22 crc kubenswrapper[4825]: E1007 20:08:22.796979 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b6jcs_openshift-machine-config-operator(a57a780f-aa1f-4e0f-9a90-5e6a70f89d18)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" Oct 07 20:08:29 crc kubenswrapper[4825]: I1007 20:08:29.395465 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vhchg_1ff41e8d-639e-4710-a863-1c6dbec99768/cert-manager-controller/0.log" Oct 07 20:08:30 crc kubenswrapper[4825]: I1007 20:08:30.373001 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-ff56x_491e6da2-5d0d-4a47-abda-467d60d5ec14/cert-manager-cainjector/0.log" Oct 07 20:08:30 crc kubenswrapper[4825]: I1007 20:08:30.474442 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ph62w_a95caa53-91d1-4d61-872a-c0ff3539d4d7/cert-manager-webhook/0.log" Oct 07 20:08:37 crc kubenswrapper[4825]: I1007 20:08:37.795074 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962" Oct 07 20:08:38 crc kubenswrapper[4825]: I1007 20:08:38.588553 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"e132f4b7438804489e938ee5565a903c6ef55634d044c2bf923546dacce3dc25"} Oct 07 20:08:43 crc kubenswrapper[4825]: I1007 20:08:43.348410 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-dzb8f_bf7784c9-07ce-45f5-ad97-788cf3ef3b36/nmstate-console-plugin/0.log" Oct 07 20:08:43 crc kubenswrapper[4825]: I1007 20:08:43.495799 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-v8dxh_2e774a23-bfdc-466c-92ed-4a9eb8f74c44/nmstate-handler/0.log" Oct 07 20:08:43 crc kubenswrapper[4825]: I1007 20:08:43.518245 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vv8pn_fef914bd-768f-4cd2-92c1-b5fb49e63ca8/kube-rbac-proxy/0.log" Oct 07 20:08:43 crc kubenswrapper[4825]: I1007 20:08:43.573648 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vv8pn_fef914bd-768f-4cd2-92c1-b5fb49e63ca8/nmstate-metrics/0.log" Oct 07 20:08:43 crc kubenswrapper[4825]: I1007 20:08:43.689066 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-8lh75_00238ddf-6ee8-44a7-97a3-7d1563e1a1d7/nmstate-operator/0.log" Oct 07 20:08:43 crc kubenswrapper[4825]: I1007 20:08:43.806742 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-4kjx7_a194e8ec-fa8a-4efb-af70-ea121bb7d835/nmstate-webhook/0.log" Oct 07 20:08:57 crc kubenswrapper[4825]: I1007 20:08:57.576989 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6vp5x_5c658b5b-d9ac-4877-894a-770c7fefcf5e/kube-rbac-proxy/0.log" Oct 07 20:08:57 crc kubenswrapper[4825]: I1007 20:08:57.638314 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6vp5x_5c658b5b-d9ac-4877-894a-770c7fefcf5e/controller/0.log" Oct 07 20:08:57 crc kubenswrapper[4825]: I1007 20:08:57.750429 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:08:57 crc kubenswrapper[4825]: I1007 20:08:57.915479 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:08:57 crc kubenswrapper[4825]: I1007 20:08:57.922871 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:08:57 crc kubenswrapper[4825]: I1007 
20:08:57.931428 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:08:57 crc kubenswrapper[4825]: I1007 20:08:57.955638 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.094541 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.106494 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.121120 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.121817 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.268969 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-frr-files/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.294686 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-reloader/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.294714 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/cp-metrics/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.303517 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/controller/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.470360 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/frr-metrics/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.534134 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/kube-rbac-proxy-frr/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.539698 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/kube-rbac-proxy/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.725982 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-sd7s7_15d8205a-b357-40f1-813d-e42c9d6ac2f0/frr-k8s-webhook-server/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.738554 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/reloader/0.log" Oct 07 20:08:58 crc kubenswrapper[4825]: I1007 20:08:58.940474 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bb67dff7d-fcd7m_90e787f1-6fb5-4827-b024-89aeb27ca750/manager/0.log" Oct 07 20:08:59 crc kubenswrapper[4825]: I1007 20:08:59.132453 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c9df698c8-5bgs4_51adc395-c4fb-43b7-a152-871a4b65a832/webhook-server/0.log" Oct 07 20:08:59 crc kubenswrapper[4825]: I1007 20:08:59.247150 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x5hwq_e705456a-fdcd-4d7e-b3e9-0146cf587db8/kube-rbac-proxy/0.log" Oct 07 20:08:59 crc kubenswrapper[4825]: I1007 20:08:59.763413 4825 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x5hwq_e705456a-fdcd-4d7e-b3e9-0146cf587db8/speaker/0.log" Oct 07 20:08:59 crc kubenswrapper[4825]: I1007 20:08:59.865423 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9q88_d7287c8b-6db9-4ec7-b7a0-52fd36aec363/frr/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.375517 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/util/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.570434 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/util/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.580019 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/pull/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.594647 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/pull/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.775904 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/util/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.794375 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/extract/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.825171 4825 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gzl4t_e49fd630-5fe7-4b4a-a455-9f53417191bf/pull/0.log" Oct 07 20:09:15 crc kubenswrapper[4825]: I1007 20:09:15.942516 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-utilities/0.log" Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.132399 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-utilities/0.log" Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.138457 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-content/0.log" Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.159383 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-content/0.log" Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.286560 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-utilities/0.log" Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.320287 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/extract-content/0.log" Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.483724 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-utilities/0.log" Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.689192 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-utilities/0.log"
Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.769504 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-content/0.log"
Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.822970 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-content/0.log"
Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.851441 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mmrrt_d4d9fecf-f52d-4758-9a79-9c80afa25e80/registry-server/0.log"
Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.946396 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-utilities/0.log"
Oct 07 20:09:16 crc kubenswrapper[4825]: I1007 20:09:16.985557 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/extract-content/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.148677 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/util/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.370379 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/pull/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.373895 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gdls5_707f4130-70a2-4161-80c6-d5767bf6752e/registry-server/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.386002 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/util/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.395321 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/pull/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.581148 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/util/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.627321 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/extract/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.648017 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835crfxw4_3b2c30cb-8398-4238-a5cd-eb2ee78812a1/pull/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.779777 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kgrmp_69920aad-eedb-4eca-887a-8f3225bff52b/marketplace-operator/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.830386 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-utilities/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.976769 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-utilities/0.log"
Oct 07 20:09:17 crc kubenswrapper[4825]: I1007 20:09:17.986422 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-content/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.005284 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-content/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.159017 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-content/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.195814 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-utilities/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.232958 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/extract-utilities/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.300304 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jzchl_1a126e8e-4603-49da-a888-c12dba592af6/registry-server/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.424618 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-content/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.426538 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-utilities/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.450398 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-content/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.601913 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-content/0.log"
Oct 07 20:09:18 crc kubenswrapper[4825]: I1007 20:09:18.640371 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/extract-utilities/0.log"
Oct 07 20:09:19 crc kubenswrapper[4825]: I1007 20:09:19.218979 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xq5h6_cbac88b1-e1d0-432b-a57d-b73910086aa8/registry-server/0.log"
Oct 07 20:11:05 crc kubenswrapper[4825]: I1007 20:11:05.708538 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 20:11:05 crc kubenswrapper[4825]: I1007 20:11:05.709085 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 20:11:11 crc kubenswrapper[4825]: I1007 20:11:11.320852 4825 generic.go:334] "Generic (PLEG): container finished" podID="1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5" containerID="ad17529029b0a269b0b45e288592a2807c71db94fc1c7a4bc57c45e5243d4e51" exitCode=0
Oct 07 20:11:11 crc kubenswrapper[4825]: I1007 20:11:11.320976 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" event={"ID":"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5","Type":"ContainerDied","Data":"ad17529029b0a269b0b45e288592a2807c71db94fc1c7a4bc57c45e5243d4e51"}
Oct 07 20:11:11 crc kubenswrapper[4825]: I1007 20:11:11.322678 4825 scope.go:117] "RemoveContainer" containerID="ad17529029b0a269b0b45e288592a2807c71db94fc1c7a4bc57c45e5243d4e51"
Oct 07 20:11:11 crc kubenswrapper[4825]: I1007 20:11:11.450459 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m2kj9_must-gather-6qf5b_1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5/gather/0.log"
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.229889 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m2kj9/must-gather-6qf5b"]
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.230692 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-m2kj9/must-gather-6qf5b" podUID="1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5" containerName="copy" containerID="cri-o://9b7d170be11a83364ec9bd760fca9b0b22c04b42648d75b84ed0d83a9718b7e6" gracePeriod=2
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.238391 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m2kj9/must-gather-6qf5b"]
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.439052 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m2kj9_must-gather-6qf5b_1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5/copy/0.log"
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.439597 4825 generic.go:334] "Generic (PLEG): container finished" podID="1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5" containerID="9b7d170be11a83364ec9bd760fca9b0b22c04b42648d75b84ed0d83a9718b7e6" exitCode=143
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.681908 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m2kj9_must-gather-6qf5b_1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5/copy/0.log"
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.682662 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/must-gather-6qf5b"
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.725267 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-must-gather-output\") pod \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") "
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.725496 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw45b\" (UniqueName: \"kubernetes.io/projected/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-kube-api-access-sw45b\") pod \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\" (UID: \"1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5\") "
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.739536 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-kube-api-access-sw45b" (OuterVolumeSpecName: "kube-api-access-sw45b") pod "1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5" (UID: "1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5"). InnerVolumeSpecName "kube-api-access-sw45b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.827855 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw45b\" (UniqueName: \"kubernetes.io/projected/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-kube-api-access-sw45b\") on node \"crc\" DevicePath \"\""
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.902361 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5" (UID: "1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 20:11:23 crc kubenswrapper[4825]: I1007 20:11:23.929866 4825 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 07 20:11:24 crc kubenswrapper[4825]: I1007 20:11:24.452416 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m2kj9_must-gather-6qf5b_1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5/copy/0.log"
Oct 07 20:11:24 crc kubenswrapper[4825]: I1007 20:11:24.453210 4825 scope.go:117] "RemoveContainer" containerID="9b7d170be11a83364ec9bd760fca9b0b22c04b42648d75b84ed0d83a9718b7e6"
Oct 07 20:11:24 crc kubenswrapper[4825]: I1007 20:11:24.453406 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m2kj9/must-gather-6qf5b"
Oct 07 20:11:24 crc kubenswrapper[4825]: I1007 20:11:24.486991 4825 scope.go:117] "RemoveContainer" containerID="ad17529029b0a269b0b45e288592a2807c71db94fc1c7a4bc57c45e5243d4e51"
Oct 07 20:11:24 crc kubenswrapper[4825]: E1007 20:11:24.609469 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a7fc474_058a_44f5_9dd6_75b1b3e6cdb5.slice\": RecentStats: unable to find data in memory cache]"
Oct 07 20:11:25 crc kubenswrapper[4825]: I1007 20:11:25.809889 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5" path="/var/lib/kubelet/pods/1a7fc474-058a-44f5-9dd6-75b1b3e6cdb5/volumes"
Oct 07 20:11:35 crc kubenswrapper[4825]: I1007 20:11:35.708925 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 20:11:35 crc kubenswrapper[4825]: I1007 20:11:35.710594 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 20:11:50 crc kubenswrapper[4825]: I1007 20:11:50.739915 4825 scope.go:117] "RemoveContainer" containerID="7d6d95a4abf7b430388e75733ee4afbab8d5320a85c38548d5a3c8fe89a23220"
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.708443 4825 patch_prober.go:28] interesting pod/machine-config-daemon-b6jcs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.709153 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.709213 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs"
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.710366 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e132f4b7438804489e938ee5565a903c6ef55634d044c2bf923546dacce3dc25"} pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.710619 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" podUID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerName="machine-config-daemon" containerID="cri-o://e132f4b7438804489e938ee5565a903c6ef55634d044c2bf923546dacce3dc25" gracePeriod=600
Oct 07 20:12:05 crc kubenswrapper[4825]: E1007 20:12:05.756133 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57a780f_aa1f_4e0f_9a90_5e6a70f89d18.slice/crio-conmon-e132f4b7438804489e938ee5565a903c6ef55634d044c2bf923546dacce3dc25.scope\": RecentStats: unable to find data in memory cache]"
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.912283 4825 generic.go:334] "Generic (PLEG): container finished" podID="a57a780f-aa1f-4e0f-9a90-5e6a70f89d18" containerID="e132f4b7438804489e938ee5565a903c6ef55634d044c2bf923546dacce3dc25" exitCode=0
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.912725 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerDied","Data":"e132f4b7438804489e938ee5565a903c6ef55634d044c2bf923546dacce3dc25"}
Oct 07 20:12:05 crc kubenswrapper[4825]: I1007 20:12:05.912777 4825 scope.go:117] "RemoveContainer" containerID="680a2d0ba7689f6f1fdd7b27d443d124ceafcc553f71498a28de05e0b81f2962"
Oct 07 20:12:06 crc kubenswrapper[4825]: I1007 20:12:06.927167 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b6jcs" event={"ID":"a57a780f-aa1f-4e0f-9a90-5e6a70f89d18","Type":"ContainerStarted","Data":"81e93479e3b4182b2af956a394659b3a09c2260c64467cf749e931144644d09b"}